Dec 01 08:35:57 localhost kernel: Linux version 5.14.0-642.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-68.el9) #1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025
Dec 01 08:35:57 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Dec 01 08:35:57 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 01 08:35:57 localhost kernel: BIOS-provided physical RAM map:
Dec 01 08:35:57 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Dec 01 08:35:57 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Dec 01 08:35:57 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Dec 01 08:35:57 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Dec 01 08:35:57 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Dec 01 08:35:57 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Dec 01 08:35:57 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Dec 01 08:35:57 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Dec 01 08:35:57 localhost kernel: NX (Execute Disable) protection: active
Dec 01 08:35:57 localhost kernel: APIC: Static calls initialized
Dec 01 08:35:57 localhost kernel: SMBIOS 2.8 present.
Dec 01 08:35:57 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Dec 01 08:35:57 localhost kernel: Hypervisor detected: KVM
Dec 01 08:35:57 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Dec 01 08:35:57 localhost kernel: kvm-clock: using sched offset of 3487215260 cycles
Dec 01 08:35:57 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Dec 01 08:35:57 localhost kernel: tsc: Detected 2800.000 MHz processor
Dec 01 08:35:57 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Dec 01 08:35:57 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Dec 01 08:35:57 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Dec 01 08:35:57 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Dec 01 08:35:57 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Dec 01 08:35:57 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Dec 01 08:35:57 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Dec 01 08:35:57 localhost kernel: Using GB pages for direct mapping
Dec 01 08:35:57 localhost kernel: RAMDISK: [mem 0x2d83a000-0x32c14fff]
Dec 01 08:35:57 localhost kernel: ACPI: Early table checksum verification disabled
Dec 01 08:35:57 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Dec 01 08:35:57 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 01 08:35:57 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 01 08:35:57 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 01 08:35:57 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Dec 01 08:35:57 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 01 08:35:57 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 01 08:35:57 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Dec 01 08:35:57 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Dec 01 08:35:57 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Dec 01 08:35:57 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Dec 01 08:35:57 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Dec 01 08:35:57 localhost kernel: No NUMA configuration found
Dec 01 08:35:57 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Dec 01 08:35:57 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Dec 01 08:35:57 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Dec 01 08:35:57 localhost kernel: Zone ranges:
Dec 01 08:35:57 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Dec 01 08:35:57 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Dec 01 08:35:57 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Dec 01 08:35:57 localhost kernel:   Device   empty
Dec 01 08:35:57 localhost kernel: Movable zone start for each node
Dec 01 08:35:57 localhost kernel: Early memory node ranges
Dec 01 08:35:57 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Dec 01 08:35:57 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Dec 01 08:35:57 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Dec 01 08:35:57 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Dec 01 08:35:57 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Dec 01 08:35:57 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Dec 01 08:35:57 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Dec 01 08:35:57 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Dec 01 08:35:57 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Dec 01 08:35:57 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Dec 01 08:35:57 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Dec 01 08:35:57 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Dec 01 08:35:57 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Dec 01 08:35:57 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Dec 01 08:35:57 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Dec 01 08:35:57 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Dec 01 08:35:57 localhost kernel: TSC deadline timer available
Dec 01 08:35:57 localhost kernel: CPU topo: Max. logical packages:   8
Dec 01 08:35:57 localhost kernel: CPU topo: Max. logical dies:       8
Dec 01 08:35:57 localhost kernel: CPU topo: Max. dies per package:   1
Dec 01 08:35:57 localhost kernel: CPU topo: Max. threads per core:   1
Dec 01 08:35:57 localhost kernel: CPU topo: Num. cores per package:     1
Dec 01 08:35:57 localhost kernel: CPU topo: Num. threads per package:   1
Dec 01 08:35:57 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Dec 01 08:35:57 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Dec 01 08:35:57 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Dec 01 08:35:57 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Dec 01 08:35:57 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Dec 01 08:35:57 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Dec 01 08:35:57 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Dec 01 08:35:57 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Dec 01 08:35:57 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Dec 01 08:35:57 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Dec 01 08:35:57 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Dec 01 08:35:57 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Dec 01 08:35:57 localhost kernel: Booting paravirtualized kernel on KVM
Dec 01 08:35:57 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Dec 01 08:35:57 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Dec 01 08:35:57 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Dec 01 08:35:57 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Dec 01 08:35:57 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Dec 01 08:35:57 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Dec 01 08:35:57 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 01 08:35:57 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64", will be passed to user space.
Dec 01 08:35:57 localhost kernel: random: crng init done
Dec 01 08:35:57 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Dec 01 08:35:57 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Dec 01 08:35:57 localhost kernel: Fallback order for Node 0: 0 
Dec 01 08:35:57 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Dec 01 08:35:57 localhost kernel: Policy zone: Normal
Dec 01 08:35:57 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 01 08:35:57 localhost kernel: software IO TLB: area num 8.
Dec 01 08:35:57 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Dec 01 08:35:57 localhost kernel: ftrace: allocating 49313 entries in 193 pages
Dec 01 08:35:57 localhost kernel: ftrace: allocated 193 pages with 3 groups
Dec 01 08:35:57 localhost kernel: Dynamic Preempt: voluntary
Dec 01 08:35:57 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 01 08:35:57 localhost kernel: rcu:         RCU event tracing is enabled.
Dec 01 08:35:57 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Dec 01 08:35:57 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Dec 01 08:35:57 localhost kernel:         Rude variant of Tasks RCU enabled.
Dec 01 08:35:57 localhost kernel:         Tracing variant of Tasks RCU enabled.
Dec 01 08:35:57 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 01 08:35:57 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Dec 01 08:35:57 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 01 08:35:57 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 01 08:35:57 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 01 08:35:57 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Dec 01 08:35:57 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec 01 08:35:57 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Dec 01 08:35:57 localhost kernel: Console: colour VGA+ 80x25
Dec 01 08:35:57 localhost kernel: printk: console [ttyS0] enabled
Dec 01 08:35:57 localhost kernel: ACPI: Core revision 20230331
Dec 01 08:35:57 localhost kernel: APIC: Switch to symmetric I/O mode setup
Dec 01 08:35:57 localhost kernel: x2apic enabled
Dec 01 08:35:57 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Dec 01 08:35:57 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Dec 01 08:35:57 localhost kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Dec 01 08:35:57 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Dec 01 08:35:57 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Dec 01 08:35:57 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Dec 01 08:35:57 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Dec 01 08:35:57 localhost kernel: Spectre V2 : Mitigation: Retpolines
Dec 01 08:35:57 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Dec 01 08:35:57 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Dec 01 08:35:57 localhost kernel: RETBleed: Mitigation: untrained return thunk
Dec 01 08:35:57 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Dec 01 08:35:57 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Dec 01 08:35:57 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Dec 01 08:35:57 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Dec 01 08:35:57 localhost kernel: x86/bugs: return thunk changed
Dec 01 08:35:57 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Dec 01 08:35:57 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Dec 01 08:35:57 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Dec 01 08:35:57 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Dec 01 08:35:57 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Dec 01 08:35:57 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Dec 01 08:35:57 localhost kernel: Freeing SMP alternatives memory: 40K
Dec 01 08:35:57 localhost kernel: pid_max: default: 32768 minimum: 301
Dec 01 08:35:57 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Dec 01 08:35:57 localhost kernel: landlock: Up and running.
Dec 01 08:35:57 localhost kernel: Yama: becoming mindful.
Dec 01 08:35:57 localhost kernel: SELinux:  Initializing.
Dec 01 08:35:57 localhost kernel: LSM support for eBPF active
Dec 01 08:35:57 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec 01 08:35:57 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec 01 08:35:57 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Dec 01 08:35:57 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Dec 01 08:35:57 localhost kernel: ... version:                0
Dec 01 08:35:57 localhost kernel: ... bit width:              48
Dec 01 08:35:57 localhost kernel: ... generic registers:      6
Dec 01 08:35:57 localhost kernel: ... value mask:             0000ffffffffffff
Dec 01 08:35:57 localhost kernel: ... max period:             00007fffffffffff
Dec 01 08:35:57 localhost kernel: ... fixed-purpose events:   0
Dec 01 08:35:57 localhost kernel: ... event mask:             000000000000003f
Dec 01 08:35:57 localhost kernel: signal: max sigframe size: 1776
Dec 01 08:35:57 localhost kernel: rcu: Hierarchical SRCU implementation.
Dec 01 08:35:57 localhost kernel: rcu:         Max phase no-delay instances is 400.
Dec 01 08:35:57 localhost kernel: smp: Bringing up secondary CPUs ...
Dec 01 08:35:57 localhost kernel: smpboot: x86: Booting SMP configuration:
Dec 01 08:35:57 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Dec 01 08:35:57 localhost kernel: smp: Brought up 1 node, 8 CPUs
Dec 01 08:35:57 localhost kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Dec 01 08:35:57 localhost kernel: node 0 deferred pages initialised in 16ms
Dec 01 08:35:57 localhost kernel: Memory: 7765960K/8388068K available (16384K kernel code, 5787K rwdata, 13900K rodata, 4192K init, 7172K bss, 616272K reserved, 0K cma-reserved)
Dec 01 08:35:57 localhost kernel: devtmpfs: initialized
Dec 01 08:35:57 localhost kernel: x86/mm: Memory block size: 128MB
Dec 01 08:35:57 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 01 08:35:57 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Dec 01 08:35:57 localhost kernel: pinctrl core: initialized pinctrl subsystem
Dec 01 08:35:57 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 01 08:35:57 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Dec 01 08:35:57 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec 01 08:35:57 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec 01 08:35:57 localhost kernel: audit: initializing netlink subsys (disabled)
Dec 01 08:35:57 localhost kernel: audit: type=2000 audit(1764578155.296:1): state=initialized audit_enabled=0 res=1
Dec 01 08:35:57 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Dec 01 08:35:57 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 01 08:35:57 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Dec 01 08:35:57 localhost kernel: cpuidle: using governor menu
Dec 01 08:35:57 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 01 08:35:57 localhost kernel: PCI: Using configuration type 1 for base access
Dec 01 08:35:57 localhost kernel: PCI: Using configuration type 1 for extended access
Dec 01 08:35:57 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Dec 01 08:35:57 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec 01 08:35:57 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Dec 01 08:35:57 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec 01 08:35:57 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Dec 01 08:35:57 localhost kernel: Demotion targets for Node 0: null
Dec 01 08:35:57 localhost kernel: cryptd: max_cpu_qlen set to 1000
Dec 01 08:35:57 localhost kernel: ACPI: Added _OSI(Module Device)
Dec 01 08:35:57 localhost kernel: ACPI: Added _OSI(Processor Device)
Dec 01 08:35:57 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Dec 01 08:35:57 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 01 08:35:57 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 01 08:35:57 localhost kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Dec 01 08:35:57 localhost kernel: ACPI: Interpreter enabled
Dec 01 08:35:57 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Dec 01 08:35:57 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Dec 01 08:35:57 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Dec 01 08:35:57 localhost kernel: PCI: Using E820 reservations for host bridge windows
Dec 01 08:35:57 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Dec 01 08:35:57 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec 01 08:35:57 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Dec 01 08:35:57 localhost kernel: acpiphp: Slot [3] registered
Dec 01 08:35:57 localhost kernel: acpiphp: Slot [4] registered
Dec 01 08:35:57 localhost kernel: acpiphp: Slot [5] registered
Dec 01 08:35:57 localhost kernel: acpiphp: Slot [6] registered
Dec 01 08:35:57 localhost kernel: acpiphp: Slot [7] registered
Dec 01 08:35:57 localhost kernel: acpiphp: Slot [8] registered
Dec 01 08:35:57 localhost kernel: acpiphp: Slot [9] registered
Dec 01 08:35:57 localhost kernel: acpiphp: Slot [10] registered
Dec 01 08:35:57 localhost kernel: acpiphp: Slot [11] registered
Dec 01 08:35:57 localhost kernel: acpiphp: Slot [12] registered
Dec 01 08:35:57 localhost kernel: acpiphp: Slot [13] registered
Dec 01 08:35:57 localhost kernel: acpiphp: Slot [14] registered
Dec 01 08:35:57 localhost kernel: acpiphp: Slot [15] registered
Dec 01 08:35:57 localhost kernel: acpiphp: Slot [16] registered
Dec 01 08:35:57 localhost kernel: acpiphp: Slot [17] registered
Dec 01 08:35:57 localhost kernel: acpiphp: Slot [18] registered
Dec 01 08:35:57 localhost kernel: acpiphp: Slot [19] registered
Dec 01 08:35:57 localhost kernel: acpiphp: Slot [20] registered
Dec 01 08:35:57 localhost kernel: acpiphp: Slot [21] registered
Dec 01 08:35:57 localhost kernel: acpiphp: Slot [22] registered
Dec 01 08:35:57 localhost kernel: acpiphp: Slot [23] registered
Dec 01 08:35:57 localhost kernel: acpiphp: Slot [24] registered
Dec 01 08:35:57 localhost kernel: acpiphp: Slot [25] registered
Dec 01 08:35:57 localhost kernel: acpiphp: Slot [26] registered
Dec 01 08:35:57 localhost kernel: acpiphp: Slot [27] registered
Dec 01 08:35:57 localhost kernel: acpiphp: Slot [28] registered
Dec 01 08:35:57 localhost kernel: acpiphp: Slot [29] registered
Dec 01 08:35:57 localhost kernel: acpiphp: Slot [30] registered
Dec 01 08:35:57 localhost kernel: acpiphp: Slot [31] registered
Dec 01 08:35:57 localhost kernel: PCI host bridge to bus 0000:00
Dec 01 08:35:57 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Dec 01 08:35:57 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Dec 01 08:35:57 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Dec 01 08:35:57 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Dec 01 08:35:57 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Dec 01 08:35:57 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec 01 08:35:57 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Dec 01 08:35:57 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Dec 01 08:35:57 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Dec 01 08:35:57 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Dec 01 08:35:57 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Dec 01 08:35:57 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Dec 01 08:35:57 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Dec 01 08:35:57 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Dec 01 08:35:57 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Dec 01 08:35:57 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Dec 01 08:35:57 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Dec 01 08:35:57 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Dec 01 08:35:57 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Dec 01 08:35:57 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Dec 01 08:35:57 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Dec 01 08:35:57 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Dec 01 08:35:57 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Dec 01 08:35:57 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Dec 01 08:35:57 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Dec 01 08:35:57 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec 01 08:35:57 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Dec 01 08:35:57 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Dec 01 08:35:57 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Dec 01 08:35:57 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Dec 01 08:35:57 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Dec 01 08:35:57 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Dec 01 08:35:57 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Dec 01 08:35:57 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Dec 01 08:35:57 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Dec 01 08:35:57 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Dec 01 08:35:57 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Dec 01 08:35:57 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Dec 01 08:35:57 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Dec 01 08:35:57 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Dec 01 08:35:57 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Dec 01 08:35:57 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Dec 01 08:35:57 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Dec 01 08:35:57 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Dec 01 08:35:57 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Dec 01 08:35:57 localhost kernel: iommu: Default domain type: Translated
Dec 01 08:35:57 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Dec 01 08:35:57 localhost kernel: SCSI subsystem initialized
Dec 01 08:35:57 localhost kernel: ACPI: bus type USB registered
Dec 01 08:35:57 localhost kernel: usbcore: registered new interface driver usbfs
Dec 01 08:35:57 localhost kernel: usbcore: registered new interface driver hub
Dec 01 08:35:57 localhost kernel: usbcore: registered new device driver usb
Dec 01 08:35:57 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Dec 01 08:35:57 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Dec 01 08:35:57 localhost kernel: PTP clock support registered
Dec 01 08:35:57 localhost kernel: EDAC MC: Ver: 3.0.0
Dec 01 08:35:57 localhost kernel: NetLabel: Initializing
Dec 01 08:35:57 localhost kernel: NetLabel:  domain hash size = 128
Dec 01 08:35:57 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Dec 01 08:35:57 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Dec 01 08:35:57 localhost kernel: PCI: Using ACPI for IRQ routing
Dec 01 08:35:57 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Dec 01 08:35:57 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Dec 01 08:35:57 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Dec 01 08:35:57 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Dec 01 08:35:57 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Dec 01 08:35:57 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Dec 01 08:35:57 localhost kernel: vgaarb: loaded
Dec 01 08:35:57 localhost kernel: clocksource: Switched to clocksource kvm-clock
Dec 01 08:35:57 localhost kernel: VFS: Disk quotas dquot_6.6.0
Dec 01 08:35:57 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec 01 08:35:57 localhost kernel: pnp: PnP ACPI init
Dec 01 08:35:57 localhost kernel: pnp 00:03: [dma 2]
Dec 01 08:35:57 localhost kernel: pnp: PnP ACPI: found 5 devices
Dec 01 08:35:57 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Dec 01 08:35:57 localhost kernel: NET: Registered PF_INET protocol family
Dec 01 08:35:57 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Dec 01 08:35:57 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Dec 01 08:35:57 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec 01 08:35:57 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Dec 01 08:35:57 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Dec 01 08:35:57 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Dec 01 08:35:57 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Dec 01 08:35:57 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec 01 08:35:57 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec 01 08:35:57 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec 01 08:35:57 localhost kernel: NET: Registered PF_XDP protocol family
Dec 01 08:35:57 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Dec 01 08:35:57 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Dec 01 08:35:57 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Dec 01 08:35:57 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Dec 01 08:35:57 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Dec 01 08:35:57 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Dec 01 08:35:57 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Dec 01 08:35:57 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Dec 01 08:35:57 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 84515 usecs
Dec 01 08:35:57 localhost kernel: PCI: CLS 0 bytes, default 64
Dec 01 08:35:57 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Dec 01 08:35:57 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Dec 01 08:35:57 localhost kernel: Trying to unpack rootfs image as initramfs...
Dec 01 08:35:57 localhost kernel: ACPI: bus type thunderbolt registered
Dec 01 08:35:57 localhost kernel: Initialise system trusted keyrings
Dec 01 08:35:57 localhost kernel: Key type blacklist registered
Dec 01 08:35:57 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Dec 01 08:35:57 localhost kernel: zbud: loaded
Dec 01 08:35:57 localhost kernel: integrity: Platform Keyring initialized
Dec 01 08:35:57 localhost kernel: integrity: Machine keyring initialized
Dec 01 08:35:57 localhost kernel: Freeing initrd memory: 85868K
Dec 01 08:35:57 localhost kernel: NET: Registered PF_ALG protocol family
Dec 01 08:35:57 localhost kernel: xor: automatically using best checksumming function   avx       
Dec 01 08:35:57 localhost kernel: Key type asymmetric registered
Dec 01 08:35:57 localhost kernel: Asymmetric key parser 'x509' registered
Dec 01 08:35:57 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Dec 01 08:35:57 localhost kernel: io scheduler mq-deadline registered
Dec 01 08:35:57 localhost kernel: io scheduler kyber registered
Dec 01 08:35:57 localhost kernel: io scheduler bfq registered
Dec 01 08:35:57 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Dec 01 08:35:57 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Dec 01 08:35:57 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Dec 01 08:35:57 localhost kernel: ACPI: button: Power Button [PWRF]
Dec 01 08:35:57 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Dec 01 08:35:57 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Dec 01 08:35:57 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Dec 01 08:35:57 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Dec 01 08:35:57 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Dec 01 08:35:57 localhost kernel: Non-volatile memory driver v1.3
Dec 01 08:35:57 localhost kernel: rdac: device handler registered
Dec 01 08:35:57 localhost kernel: hp_sw: device handler registered
Dec 01 08:35:57 localhost kernel: emc: device handler registered
Dec 01 08:35:57 localhost kernel: alua: device handler registered
Dec 01 08:35:57 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Dec 01 08:35:57 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Dec 01 08:35:57 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Dec 01 08:35:57 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Dec 01 08:35:57 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Dec 01 08:35:57 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Dec 01 08:35:57 localhost kernel: usb usb1: Product: UHCI Host Controller
Dec 01 08:35:57 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-642.el9.x86_64 uhci_hcd
Dec 01 08:35:57 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Dec 01 08:35:57 localhost kernel: hub 1-0:1.0: USB hub found
Dec 01 08:35:57 localhost kernel: hub 1-0:1.0: 2 ports detected
Dec 01 08:35:57 localhost kernel: usbcore: registered new interface driver usbserial_generic
Dec 01 08:35:57 localhost kernel: usbserial: USB Serial support registered for generic
Dec 01 08:35:57 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Dec 01 08:35:57 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Dec 01 08:35:57 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Dec 01 08:35:57 localhost kernel: mousedev: PS/2 mouse device common for all mice
Dec 01 08:35:57 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Dec 01 08:35:57 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Dec 01 08:35:57 localhost kernel: rtc_cmos 00:04: registered as rtc0
Dec 01 08:35:57 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-12-01T08:35:56 UTC (1764578156)
Dec 01 08:35:57 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Dec 01 08:35:57 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Dec 01 08:35:57 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Dec 01 08:35:57 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Dec 01 08:35:57 localhost kernel: usbcore: registered new interface driver usbhid
Dec 01 08:35:57 localhost kernel: usbhid: USB HID core driver
Dec 01 08:35:57 localhost kernel: drop_monitor: Initializing network drop monitor service
Dec 01 08:35:57 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Dec 01 08:35:57 localhost kernel: Initializing XFRM netlink socket
Dec 01 08:35:57 localhost kernel: NET: Registered PF_INET6 protocol family
Dec 01 08:35:57 localhost kernel: Segment Routing with IPv6
Dec 01 08:35:57 localhost kernel: NET: Registered PF_PACKET protocol family
Dec 01 08:35:57 localhost kernel: mpls_gso: MPLS GSO support
Dec 01 08:35:57 localhost kernel: IPI shorthand broadcast: enabled
Dec 01 08:35:57 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Dec 01 08:35:57 localhost kernel: AES CTR mode by8 optimization enabled
Dec 01 08:35:57 localhost kernel: sched_clock: Marking stable (1513001650, 140242759)->(1780771269, -127526860)
Dec 01 08:35:57 localhost kernel: registered taskstats version 1
Dec 01 08:35:57 localhost kernel: Loading compiled-in X.509 certificates
Dec 01 08:35:57 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Dec 01 08:35:57 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Dec 01 08:35:57 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Dec 01 08:35:57 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Dec 01 08:35:57 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Dec 01 08:35:57 localhost kernel: Demotion targets for Node 0: null
Dec 01 08:35:57 localhost kernel: page_owner is disabled
Dec 01 08:35:57 localhost kernel: Key type .fscrypt registered
Dec 01 08:35:57 localhost kernel: Key type fscrypt-provisioning registered
Dec 01 08:35:57 localhost kernel: Key type big_key registered
Dec 01 08:35:57 localhost kernel: Key type encrypted registered
Dec 01 08:35:57 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Dec 01 08:35:57 localhost kernel: Loading compiled-in module X.509 certificates
Dec 01 08:35:57 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Dec 01 08:35:57 localhost kernel: ima: Allocated hash algorithm: sha256
Dec 01 08:35:57 localhost kernel: ima: No architecture policies found
Dec 01 08:35:57 localhost kernel: evm: Initialising EVM extended attributes:
Dec 01 08:35:57 localhost kernel: evm: security.selinux
Dec 01 08:35:57 localhost kernel: evm: security.SMACK64 (disabled)
Dec 01 08:35:57 localhost kernel: evm: security.SMACK64EXEC (disabled)
Dec 01 08:35:57 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Dec 01 08:35:57 localhost kernel: evm: security.SMACK64MMAP (disabled)
Dec 01 08:35:57 localhost kernel: evm: security.apparmor (disabled)
Dec 01 08:35:57 localhost kernel: evm: security.ima
Dec 01 08:35:57 localhost kernel: evm: security.capability
Dec 01 08:35:57 localhost kernel: evm: HMAC attrs: 0x1
Dec 01 08:35:57 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Dec 01 08:35:57 localhost kernel: Running certificate verification RSA selftest
Dec 01 08:35:57 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Dec 01 08:35:57 localhost kernel: Running certificate verification ECDSA selftest
Dec 01 08:35:57 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Dec 01 08:35:57 localhost kernel: clk: Disabling unused clocks
Dec 01 08:35:57 localhost kernel: Freeing unused decrypted memory: 2028K
Dec 01 08:35:57 localhost kernel: Freeing unused kernel image (initmem) memory: 4192K
Dec 01 08:35:57 localhost kernel: Write protecting the kernel read-only data: 30720k
Dec 01 08:35:57 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 436K
Dec 01 08:35:57 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Dec 01 08:35:57 localhost kernel: Run /init as init process
Dec 01 08:35:57 localhost kernel:   with arguments:
Dec 01 08:35:57 localhost kernel:     /init
Dec 01 08:35:57 localhost kernel:   with environment:
Dec 01 08:35:57 localhost kernel:     HOME=/
Dec 01 08:35:57 localhost kernel:     TERM=linux
Dec 01 08:35:57 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64
Dec 01 08:35:57 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 01 08:35:57 localhost systemd[1]: Detected virtualization kvm.
Dec 01 08:35:57 localhost systemd[1]: Detected architecture x86-64.
Dec 01 08:35:57 localhost systemd[1]: Running in initrd.
Dec 01 08:35:57 localhost systemd[1]: No hostname configured, using default hostname.
Dec 01 08:35:57 localhost systemd[1]: Hostname set to <localhost>.
Dec 01 08:35:57 localhost systemd[1]: Initializing machine ID from VM UUID.
Dec 01 08:35:57 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Dec 01 08:35:57 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Dec 01 08:35:57 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Dec 01 08:35:57 localhost kernel: usb 1-1: Manufacturer: QEMU
Dec 01 08:35:57 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Dec 01 08:35:57 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Dec 01 08:35:57 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Dec 01 08:35:57 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Dec 01 08:35:57 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Dec 01 08:35:57 localhost systemd[1]: Reached target Local Encrypted Volumes.
Dec 01 08:35:57 localhost systemd[1]: Reached target Initrd /usr File System.
Dec 01 08:35:57 localhost systemd[1]: Reached target Local File Systems.
Dec 01 08:35:57 localhost systemd[1]: Reached target Path Units.
Dec 01 08:35:57 localhost systemd[1]: Reached target Slice Units.
Dec 01 08:35:57 localhost systemd[1]: Reached target Swaps.
Dec 01 08:35:57 localhost systemd[1]: Reached target Timer Units.
Dec 01 08:35:57 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec 01 08:35:57 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Dec 01 08:35:57 localhost systemd[1]: Listening on Journal Socket.
Dec 01 08:35:57 localhost systemd[1]: Listening on udev Control Socket.
Dec 01 08:35:57 localhost systemd[1]: Listening on udev Kernel Socket.
Dec 01 08:35:57 localhost systemd[1]: Reached target Socket Units.
Dec 01 08:35:57 localhost systemd[1]: Starting Create List of Static Device Nodes...
Dec 01 08:35:57 localhost systemd[1]: Starting Journal Service...
Dec 01 08:35:57 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec 01 08:35:57 localhost systemd[1]: Starting Apply Kernel Variables...
Dec 01 08:35:57 localhost systemd[1]: Starting Create System Users...
Dec 01 08:35:57 localhost systemd[1]: Starting Setup Virtual Console...
Dec 01 08:35:57 localhost systemd[1]: Finished Create List of Static Device Nodes.
Dec 01 08:35:57 localhost systemd[1]: Finished Apply Kernel Variables.
Dec 01 08:35:57 localhost systemd[1]: Finished Create System Users.
Dec 01 08:35:57 localhost systemd-journald[305]: Journal started
Dec 01 08:35:57 localhost systemd-journald[305]: Runtime Journal (/run/log/journal/523109271d304bda9d2bfd9f7cfadc4d) is 8.0M, max 153.6M, 145.6M free.
Dec 01 08:35:57 localhost systemd-sysusers[310]: Creating group 'users' with GID 100.
Dec 01 08:35:57 localhost systemd-sysusers[310]: Creating group 'dbus' with GID 81.
Dec 01 08:35:57 localhost systemd-sysusers[310]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Dec 01 08:35:57 localhost systemd[1]: Started Journal Service.
Dec 01 08:35:57 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 01 08:35:57 localhost systemd[1]: Starting Create Volatile Files and Directories...
Dec 01 08:35:57 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 01 08:35:57 localhost systemd[1]: Finished Create Volatile Files and Directories.
Dec 01 08:35:57 localhost systemd[1]: Finished Setup Virtual Console.
Dec 01 08:35:57 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Dec 01 08:35:57 localhost systemd[1]: Starting dracut cmdline hook...
Dec 01 08:35:57 localhost dracut-cmdline[325]: dracut-9 dracut-057-102.git20250818.el9
Dec 01 08:35:57 localhost dracut-cmdline[325]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 01 08:35:57 localhost systemd[1]: Finished dracut cmdline hook.
Dec 01 08:35:57 localhost systemd[1]: Starting dracut pre-udev hook...
Dec 01 08:35:57 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Dec 01 08:35:57 localhost kernel: device-mapper: uevent: version 1.0.3
Dec 01 08:35:57 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Dec 01 08:35:57 localhost kernel: RPC: Registered named UNIX socket transport module.
Dec 01 08:35:57 localhost kernel: RPC: Registered udp transport module.
Dec 01 08:35:57 localhost kernel: RPC: Registered tcp transport module.
Dec 01 08:35:57 localhost kernel: RPC: Registered tcp-with-tls transport module.
Dec 01 08:35:57 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Dec 01 08:35:57 localhost rpc.statd[441]: Version 2.5.4 starting
Dec 01 08:35:57 localhost rpc.statd[441]: Initializing NSM state
Dec 01 08:35:57 localhost rpc.idmapd[446]: Setting log level to 0
Dec 01 08:35:57 localhost systemd[1]: Finished dracut pre-udev hook.
Dec 01 08:35:57 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 01 08:35:57 localhost systemd-udevd[459]: Using default interface naming scheme 'rhel-9.0'.
Dec 01 08:35:57 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 01 08:35:57 localhost systemd[1]: Starting dracut pre-trigger hook...
Dec 01 08:35:57 localhost systemd[1]: Finished dracut pre-trigger hook.
Dec 01 08:35:57 localhost systemd[1]: Starting Coldplug All udev Devices...
Dec 01 08:35:57 localhost systemd[1]: Created slice Slice /system/modprobe.
Dec 01 08:35:58 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 01 08:35:58 localhost systemd[1]: Finished Coldplug All udev Devices.
Dec 01 08:35:58 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 01 08:35:58 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 01 08:35:58 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 01 08:35:58 localhost systemd[1]: Reached target Network.
Dec 01 08:35:58 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 01 08:35:58 localhost systemd[1]: Starting dracut initqueue hook...
Dec 01 08:35:58 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Dec 01 08:35:58 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Dec 01 08:35:58 localhost kernel:  vda: vda1
Dec 01 08:35:58 localhost kernel: libata version 3.00 loaded.
Dec 01 08:35:58 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Dec 01 08:35:58 localhost kernel: scsi host0: ata_piix
Dec 01 08:35:58 localhost kernel: scsi host1: ata_piix
Dec 01 08:35:58 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Dec 01 08:35:58 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Dec 01 08:35:58 localhost systemd[1]: Mounting Kernel Configuration File System...
Dec 01 08:35:58 localhost systemd[1]: Mounted Kernel Configuration File System.
Dec 01 08:35:58 localhost systemd[1]: Found device /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253.
Dec 01 08:35:58 localhost systemd[1]: Reached target Initrd Root Device.
Dec 01 08:35:58 localhost systemd[1]: Reached target System Initialization.
Dec 01 08:35:58 localhost systemd[1]: Reached target Basic System.
Dec 01 08:35:58 localhost kernel: ata1: found unknown device (class 0)
Dec 01 08:35:58 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Dec 01 08:35:58 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Dec 01 08:35:58 localhost systemd-udevd[478]: Network interface NamePolicy= disabled on kernel command line.
Dec 01 08:35:58 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Dec 01 08:35:58 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Dec 01 08:35:58 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Dec 01 08:35:58 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Dec 01 08:35:58 localhost systemd[1]: Finished dracut initqueue hook.
Dec 01 08:35:58 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Dec 01 08:35:58 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Dec 01 08:35:58 localhost systemd[1]: Reached target Remote File Systems.
Dec 01 08:35:58 localhost systemd[1]: Starting dracut pre-mount hook...
Dec 01 08:35:58 localhost systemd[1]: Finished dracut pre-mount hook.
Dec 01 08:35:58 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253...
Dec 01 08:35:58 localhost systemd-fsck[554]: /usr/sbin/fsck.xfs: XFS file system.
Dec 01 08:35:58 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253.
Dec 01 08:35:58 localhost systemd[1]: Mounting /sysroot...
Dec 01 08:35:58 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Dec 01 08:35:58 localhost kernel: XFS (vda1): Mounting V5 Filesystem b277050f-8ace-464d-abb6-4c46d4c45253
Dec 01 08:35:59 localhost kernel: XFS (vda1): Ending clean mount
Dec 01 08:35:59 localhost systemd[1]: Mounted /sysroot.
Dec 01 08:35:59 localhost systemd[1]: Reached target Initrd Root File System.
Dec 01 08:35:59 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Dec 01 08:35:59 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Dec 01 08:35:59 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Dec 01 08:35:59 localhost systemd[1]: Reached target Initrd File Systems.
Dec 01 08:35:59 localhost systemd[1]: Reached target Initrd Default Target.
Dec 01 08:35:59 localhost systemd[1]: Starting dracut mount hook...
Dec 01 08:35:59 localhost systemd[1]: Finished dracut mount hook.
Dec 01 08:35:59 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Dec 01 08:35:59 localhost rpc.idmapd[446]: exiting on signal 15
Dec 01 08:35:59 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Dec 01 08:35:59 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Dec 01 08:35:59 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Dec 01 08:35:59 localhost systemd[1]: Stopped target Network.
Dec 01 08:35:59 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Dec 01 08:35:59 localhost systemd[1]: Stopped target Timer Units.
Dec 01 08:35:59 localhost systemd[1]: dbus.socket: Deactivated successfully.
Dec 01 08:35:59 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Dec 01 08:35:59 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Dec 01 08:35:59 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Dec 01 08:35:59 localhost systemd[1]: Stopped target Initrd Default Target.
Dec 01 08:35:59 localhost systemd[1]: Stopped target Basic System.
Dec 01 08:35:59 localhost systemd[1]: Stopped target Initrd Root Device.
Dec 01 08:35:59 localhost systemd[1]: Stopped target Initrd /usr File System.
Dec 01 08:35:59 localhost systemd[1]: Stopped target Path Units.
Dec 01 08:35:59 localhost systemd[1]: Stopped target Remote File Systems.
Dec 01 08:35:59 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Dec 01 08:35:59 localhost systemd[1]: Stopped target Slice Units.
Dec 01 08:35:59 localhost systemd[1]: Stopped target Socket Units.
Dec 01 08:35:59 localhost systemd[1]: Stopped target System Initialization.
Dec 01 08:35:59 localhost systemd[1]: Stopped target Local File Systems.
Dec 01 08:35:59 localhost systemd[1]: Stopped target Swaps.
Dec 01 08:35:59 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Dec 01 08:35:59 localhost systemd[1]: Stopped dracut mount hook.
Dec 01 08:35:59 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Dec 01 08:35:59 localhost systemd[1]: Stopped dracut pre-mount hook.
Dec 01 08:35:59 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Dec 01 08:35:59 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Dec 01 08:35:59 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Dec 01 08:35:59 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Dec 01 08:35:59 localhost systemd[1]: Stopped dracut initqueue hook.
Dec 01 08:35:59 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 01 08:35:59 localhost systemd[1]: Stopped Apply Kernel Variables.
Dec 01 08:35:59 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Dec 01 08:35:59 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Dec 01 08:35:59 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Dec 01 08:35:59 localhost systemd[1]: Stopped Coldplug All udev Devices.
Dec 01 08:35:59 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Dec 01 08:35:59 localhost systemd[1]: Stopped dracut pre-trigger hook.
Dec 01 08:35:59 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Dec 01 08:35:59 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 01 08:35:59 localhost systemd[1]: Stopped Setup Virtual Console.
Dec 01 08:35:59 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Dec 01 08:35:59 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 01 08:35:59 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Dec 01 08:35:59 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Dec 01 08:35:59 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec 01 08:35:59 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Dec 01 08:35:59 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Dec 01 08:35:59 localhost systemd[1]: Closed udev Control Socket.
Dec 01 08:35:59 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Dec 01 08:35:59 localhost systemd[1]: Closed udev Kernel Socket.
Dec 01 08:35:59 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Dec 01 08:35:59 localhost systemd[1]: Stopped dracut pre-udev hook.
Dec 01 08:35:59 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Dec 01 08:35:59 localhost systemd[1]: Stopped dracut cmdline hook.
Dec 01 08:35:59 localhost systemd[1]: Starting Cleanup udev Database...
Dec 01 08:35:59 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Dec 01 08:35:59 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Dec 01 08:35:59 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Dec 01 08:35:59 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Dec 01 08:35:59 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Dec 01 08:35:59 localhost systemd[1]: Stopped Create System Users.
Dec 01 08:35:59 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Dec 01 08:35:59 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Dec 01 08:35:59 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Dec 01 08:35:59 localhost systemd[1]: Finished Cleanup udev Database.
Dec 01 08:35:59 localhost systemd[1]: Reached target Switch Root.
Dec 01 08:35:59 localhost systemd[1]: Starting Switch Root...
Dec 01 08:35:59 localhost systemd[1]: Switching root.
Dec 01 08:35:59 localhost systemd-journald[305]: Journal stopped
Dec 01 08:36:00 localhost systemd-journald[305]: Received SIGTERM from PID 1 (systemd).
Dec 01 08:36:00 localhost kernel: audit: type=1404 audit(1764578159.503:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Dec 01 08:36:00 localhost kernel: SELinux:  policy capability network_peer_controls=1
Dec 01 08:36:00 localhost kernel: SELinux:  policy capability open_perms=1
Dec 01 08:36:00 localhost kernel: SELinux:  policy capability extended_socket_class=1
Dec 01 08:36:00 localhost kernel: SELinux:  policy capability always_check_network=0
Dec 01 08:36:00 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 01 08:36:00 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 01 08:36:00 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 01 08:36:00 localhost kernel: audit: type=1403 audit(1764578159.665:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Dec 01 08:36:00 localhost systemd[1]: Successfully loaded SELinux policy in 165.213ms.
Dec 01 08:36:00 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 27.076ms.
Dec 01 08:36:00 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 01 08:36:00 localhost systemd[1]: Detected virtualization kvm.
Dec 01 08:36:00 localhost systemd[1]: Detected architecture x86-64.
Dec 01 08:36:00 localhost systemd-rc-local-generator[636]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 08:36:00 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Dec 01 08:36:00 localhost systemd[1]: Stopped Switch Root.
Dec 01 08:36:00 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec 01 08:36:00 localhost systemd[1]: Created slice Slice /system/getty.
Dec 01 08:36:00 localhost systemd[1]: Created slice Slice /system/serial-getty.
Dec 01 08:36:00 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Dec 01 08:36:00 localhost systemd[1]: Created slice User and Session Slice.
Dec 01 08:36:00 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Dec 01 08:36:00 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Dec 01 08:36:00 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Dec 01 08:36:00 localhost systemd[1]: Reached target Local Encrypted Volumes.
Dec 01 08:36:00 localhost systemd[1]: Stopped target Switch Root.
Dec 01 08:36:00 localhost systemd[1]: Stopped target Initrd File Systems.
Dec 01 08:36:00 localhost systemd[1]: Stopped target Initrd Root File System.
Dec 01 08:36:00 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Dec 01 08:36:00 localhost systemd[1]: Reached target Path Units.
Dec 01 08:36:00 localhost systemd[1]: Reached target rpc_pipefs.target.
Dec 01 08:36:00 localhost systemd[1]: Reached target Slice Units.
Dec 01 08:36:00 localhost systemd[1]: Reached target Swaps.
Dec 01 08:36:00 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Dec 01 08:36:00 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Dec 01 08:36:00 localhost systemd[1]: Reached target RPC Port Mapper.
Dec 01 08:36:00 localhost systemd[1]: Listening on Process Core Dump Socket.
Dec 01 08:36:00 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Dec 01 08:36:00 localhost systemd[1]: Listening on udev Control Socket.
Dec 01 08:36:00 localhost systemd[1]: Listening on udev Kernel Socket.
Dec 01 08:36:00 localhost systemd[1]: Mounting Huge Pages File System...
Dec 01 08:36:00 localhost systemd[1]: Mounting POSIX Message Queue File System...
Dec 01 08:36:00 localhost systemd[1]: Mounting Kernel Debug File System...
Dec 01 08:36:00 localhost systemd[1]: Mounting Kernel Trace File System...
Dec 01 08:36:00 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 01 08:36:00 localhost systemd[1]: Starting Create List of Static Device Nodes...
Dec 01 08:36:00 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 01 08:36:00 localhost systemd[1]: Starting Load Kernel Module drm...
Dec 01 08:36:00 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Dec 01 08:36:00 localhost systemd[1]: Starting Load Kernel Module fuse...
Dec 01 08:36:00 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Dec 01 08:36:00 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Dec 01 08:36:00 localhost systemd[1]: Stopped File System Check on Root Device.
Dec 01 08:36:00 localhost systemd[1]: Stopped Journal Service.
Dec 01 08:36:00 localhost kernel: fuse: init (API version 7.37)
Dec 01 08:36:00 localhost systemd[1]: Starting Journal Service...
Dec 01 08:36:00 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec 01 08:36:00 localhost systemd[1]: Starting Generate network units from Kernel command line...
Dec 01 08:36:00 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 01 08:36:00 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Dec 01 08:36:00 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 01 08:36:00 localhost systemd[1]: Starting Apply Kernel Variables...
Dec 01 08:36:00 localhost systemd[1]: Starting Coldplug All udev Devices...
Dec 01 08:36:00 localhost systemd-journald[677]: Journal started
Dec 01 08:36:00 localhost systemd-journald[677]: Runtime Journal (/run/log/journal/1f988c78c563e12389ab342aced42dbb) is 8.0M, max 153.6M, 145.6M free.
Dec 01 08:35:59 localhost systemd[1]: Queued start job for default target Multi-User System.
Dec 01 08:35:59 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Dec 01 08:36:00 localhost systemd[1]: Started Journal Service.
Dec 01 08:36:00 localhost systemd[1]: Mounted Huge Pages File System.
Dec 01 08:36:00 localhost systemd[1]: Mounted POSIX Message Queue File System.
Dec 01 08:36:00 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Dec 01 08:36:00 localhost kernel: ACPI: bus type drm_connector registered
Dec 01 08:36:00 localhost systemd[1]: Mounted Kernel Debug File System.
Dec 01 08:36:00 localhost systemd[1]: Mounted Kernel Trace File System.
Dec 01 08:36:00 localhost systemd[1]: Finished Create List of Static Device Nodes.
Dec 01 08:36:00 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 01 08:36:00 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 01 08:36:00 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 01 08:36:00 localhost systemd[1]: Finished Load Kernel Module drm.
Dec 01 08:36:00 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 01 08:36:00 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Dec 01 08:36:00 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec 01 08:36:00 localhost systemd[1]: Finished Load Kernel Module fuse.
Dec 01 08:36:00 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Dec 01 08:36:00 localhost systemd[1]: Finished Generate network units from Kernel command line.
Dec 01 08:36:00 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Dec 01 08:36:00 localhost systemd[1]: Finished Apply Kernel Variables.
Dec 01 08:36:00 localhost systemd[1]: Mounting FUSE Control File System...
Dec 01 08:36:00 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 01 08:36:00 localhost systemd[1]: Starting Rebuild Hardware Database...
Dec 01 08:36:00 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Dec 01 08:36:00 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec 01 08:36:00 localhost systemd[1]: Starting Load/Save OS Random Seed...
Dec 01 08:36:00 localhost systemd[1]: Starting Create System Users...
Dec 01 08:36:00 localhost systemd-journald[677]: Runtime Journal (/run/log/journal/1f988c78c563e12389ab342aced42dbb) is 8.0M, max 153.6M, 145.6M free.
Dec 01 08:36:00 localhost systemd-journald[677]: Received client request to flush runtime journal.
Dec 01 08:36:00 localhost systemd[1]: Mounted FUSE Control File System.
Dec 01 08:36:00 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Dec 01 08:36:00 localhost systemd[1]: Finished Load/Save OS Random Seed.
Dec 01 08:36:00 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 01 08:36:00 localhost systemd[1]: Finished Create System Users.
Dec 01 08:36:00 localhost systemd[1]: Finished Coldplug All udev Devices.
Dec 01 08:36:00 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 01 08:36:00 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 01 08:36:00 localhost systemd[1]: Reached target Preparation for Local File Systems.
Dec 01 08:36:00 localhost systemd[1]: Reached target Local File Systems.
Dec 01 08:36:00 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Dec 01 08:36:00 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Dec 01 08:36:00 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 01 08:36:00 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Dec 01 08:36:00 localhost systemd[1]: Starting Automatic Boot Loader Update...
Dec 01 08:36:00 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Dec 01 08:36:00 localhost systemd[1]: Starting Create Volatile Files and Directories...
Dec 01 08:36:00 localhost bootctl[696]: Couldn't find EFI system partition, skipping.
Dec 01 08:36:00 localhost systemd[1]: Finished Automatic Boot Loader Update.
Dec 01 08:36:00 localhost systemd[1]: Finished Create Volatile Files and Directories.
Dec 01 08:36:00 localhost systemd[1]: Starting Security Auditing Service...
Dec 01 08:36:00 localhost systemd[1]: Starting RPC Bind...
Dec 01 08:36:00 localhost systemd[1]: Starting Rebuild Journal Catalog...
Dec 01 08:36:00 localhost auditd[702]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Dec 01 08:36:00 localhost auditd[702]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Dec 01 08:36:00 localhost systemd[1]: Started RPC Bind.
Dec 01 08:36:00 localhost systemd[1]: Finished Rebuild Journal Catalog.
Dec 01 08:36:00 localhost augenrules[707]: /sbin/augenrules: No change
Dec 01 08:36:00 localhost augenrules[723]: No rules
Dec 01 08:36:00 localhost augenrules[723]: enabled 1
Dec 01 08:36:00 localhost augenrules[723]: failure 1
Dec 01 08:36:00 localhost augenrules[723]: pid 702
Dec 01 08:36:00 localhost augenrules[723]: rate_limit 0
Dec 01 08:36:00 localhost augenrules[723]: backlog_limit 8192
Dec 01 08:36:00 localhost augenrules[723]: lost 0
Dec 01 08:36:00 localhost augenrules[723]: backlog 2
Dec 01 08:36:00 localhost augenrules[723]: backlog_wait_time 60000
Dec 01 08:36:00 localhost augenrules[723]: backlog_wait_time_actual 0
Dec 01 08:36:00 localhost augenrules[723]: enabled 1
Dec 01 08:36:00 localhost augenrules[723]: failure 1
Dec 01 08:36:00 localhost augenrules[723]: pid 702
Dec 01 08:36:00 localhost augenrules[723]: rate_limit 0
Dec 01 08:36:00 localhost augenrules[723]: backlog_limit 8192
Dec 01 08:36:00 localhost augenrules[723]: lost 0
Dec 01 08:36:00 localhost augenrules[723]: backlog 2
Dec 01 08:36:00 localhost augenrules[723]: backlog_wait_time 60000
Dec 01 08:36:00 localhost augenrules[723]: backlog_wait_time_actual 0
Dec 01 08:36:00 localhost augenrules[723]: enabled 1
Dec 01 08:36:00 localhost augenrules[723]: failure 1
Dec 01 08:36:00 localhost augenrules[723]: pid 702
Dec 01 08:36:00 localhost augenrules[723]: rate_limit 0
Dec 01 08:36:00 localhost augenrules[723]: backlog_limit 8192
Dec 01 08:36:00 localhost augenrules[723]: lost 0
Dec 01 08:36:00 localhost augenrules[723]: backlog 0
Dec 01 08:36:00 localhost augenrules[723]: backlog_wait_time 60000
Dec 01 08:36:00 localhost augenrules[723]: backlog_wait_time_actual 0
Dec 01 08:36:00 localhost systemd[1]: Started Security Auditing Service.
Dec 01 08:36:00 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Dec 01 08:36:00 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Dec 01 08:36:00 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Dec 01 08:36:00 localhost systemd[1]: Finished Rebuild Hardware Database.
Dec 01 08:36:00 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 01 08:36:01 localhost systemd[1]: Starting Update is Completed...
Dec 01 08:36:01 localhost systemd[1]: Finished Update is Completed.
Dec 01 08:36:01 localhost systemd-udevd[731]: Using default interface naming scheme 'rhel-9.0'.
Dec 01 08:36:01 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 01 08:36:01 localhost systemd[1]: Reached target System Initialization.
Dec 01 08:36:01 localhost systemd[1]: Started dnf makecache --timer.
Dec 01 08:36:01 localhost systemd[1]: Started Daily rotation of log files.
Dec 01 08:36:01 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Dec 01 08:36:01 localhost systemd[1]: Reached target Timer Units.
Dec 01 08:36:01 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec 01 08:36:01 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Dec 01 08:36:01 localhost systemd[1]: Reached target Socket Units.
Dec 01 08:36:01 localhost systemd[1]: Starting D-Bus System Message Bus...
Dec 01 08:36:01 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 01 08:36:01 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Dec 01 08:36:01 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 01 08:36:01 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 01 08:36:01 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 01 08:36:01 localhost systemd-udevd[736]: Network interface NamePolicy= disabled on kernel command line.
Dec 01 08:36:01 localhost systemd[1]: Started D-Bus System Message Bus.
Dec 01 08:36:01 localhost systemd[1]: Reached target Basic System.
Dec 01 08:36:01 localhost dbus-broker-lau[757]: Ready
Dec 01 08:36:01 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Dec 01 08:36:01 localhost systemd[1]: Starting NTP client/server...
Dec 01 08:36:01 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Dec 01 08:36:01 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Dec 01 08:36:01 localhost systemd[1]: Starting IPv4 firewall with iptables...
Dec 01 08:36:01 localhost systemd[1]: Started irqbalance daemon.
Dec 01 08:36:01 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Dec 01 08:36:01 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 01 08:36:01 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 01 08:36:01 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 01 08:36:01 localhost systemd[1]: Reached target sshd-keygen.target.
Dec 01 08:36:01 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Dec 01 08:36:01 localhost systemd[1]: Reached target User and Group Name Lookups.
Dec 01 08:36:01 localhost systemd[1]: Starting User Login Management...
Dec 01 08:36:01 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Dec 01 08:36:01 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Dec 01 08:36:01 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Dec 01 08:36:01 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Dec 01 08:36:01 localhost chronyd[791]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec 01 08:36:01 localhost chronyd[791]: Loaded 0 symmetric keys
Dec 01 08:36:01 localhost chronyd[791]: Using right/UTC timezone to obtain leap second data
Dec 01 08:36:01 localhost chronyd[791]: Loaded seccomp filter (level 2)
Dec 01 08:36:01 localhost systemd[1]: Started NTP client/server.
Dec 01 08:36:01 localhost systemd-logind[788]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 01 08:36:01 localhost systemd-logind[788]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 01 08:36:01 localhost systemd-logind[788]: New seat seat0.
Dec 01 08:36:01 localhost systemd[1]: Started User Login Management.
Dec 01 08:36:01 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Dec 01 08:36:01 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Dec 01 08:36:01 localhost kernel: Console: switching to colour dummy device 80x25
Dec 01 08:36:01 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Dec 01 08:36:01 localhost kernel: [drm] features: -context_init
Dec 01 08:36:01 localhost kernel: [drm] number of scanouts: 1
Dec 01 08:36:01 localhost kernel: [drm] number of cap sets: 0
Dec 01 08:36:01 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Dec 01 08:36:01 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Dec 01 08:36:01 localhost kernel: Console: switching to colour frame buffer device 128x48
Dec 01 08:36:01 localhost kernel: kvm_amd: TSC scaling supported
Dec 01 08:36:01 localhost kernel: kvm_amd: Nested Virtualization enabled
Dec 01 08:36:01 localhost kernel: kvm_amd: Nested Paging enabled
Dec 01 08:36:01 localhost kernel: kvm_amd: LBR virtualization supported
Dec 01 08:36:01 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Dec 01 08:36:01 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Dec 01 08:36:01 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Dec 01 08:36:01 localhost iptables.init[781]: iptables: Applying firewall rules: [  OK  ]
Dec 01 08:36:01 localhost systemd[1]: Finished IPv4 firewall with iptables.
Dec 01 08:36:01 localhost cloud-init[839]: Cloud-init v. 24.4-7.el9 running 'init-local' at Mon, 01 Dec 2025 08:36:01 +0000. Up 6.64 seconds.
Dec 01 08:36:01 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Dec 01 08:36:01 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Dec 01 08:36:01 localhost systemd[1]: run-cloud\x2dinit-tmp-tmp5905b2m4.mount: Deactivated successfully.
Dec 01 08:36:01 localhost systemd[1]: Starting Hostname Service...
Dec 01 08:36:02 localhost systemd[1]: Started Hostname Service.
Dec 01 08:36:02 np0005540741.novalocal systemd-hostnamed[853]: Hostname set to <np0005540741.novalocal> (static)
Dec 01 08:36:02 np0005540741.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Dec 01 08:36:02 np0005540741.novalocal systemd[1]: Reached target Preparation for Network.
Dec 01 08:36:02 np0005540741.novalocal systemd[1]: Starting Network Manager...
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.1846] NetworkManager (version 1.54.1-1.el9) is starting... (boot:fbf967f0-219c-4ceb-b589-3e4f3756d2b4)
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.1850] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2031] manager[0x562f6a532080]: monitoring kernel firmware directory '/lib/firmware'.
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2077] hostname: hostname: using hostnamed
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2077] hostname: static hostname changed from (none) to "np0005540741.novalocal"
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2080] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2170] manager[0x562f6a532080]: rfkill: Wi-Fi hardware radio set enabled
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2170] manager[0x562f6a532080]: rfkill: WWAN hardware radio set enabled
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2204] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2204] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2205] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2206] manager: Networking is enabled by state file
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2208] settings: Loaded settings plugin: keyfile (internal)
Dec 01 08:36:02 np0005540741.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2222] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2243] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2253] dhcp: init: Using DHCP client 'internal'
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2255] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2266] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2273] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2279] device (lo): Activation: starting connection 'lo' (d32b959f-25b9-49e2-b1c9-8c743b9b7f56)
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2286] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2288] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2313] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2316] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2318] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2319] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2321] device (eth0): carrier: link connected
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2324] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2329] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2337] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2340] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2341] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2343] manager: NetworkManager state is now CONNECTING
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2344] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2349] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2351] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 01 08:36:02 np0005540741.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 01 08:36:02 np0005540741.novalocal systemd[1]: Started Network Manager.
Dec 01 08:36:02 np0005540741.novalocal systemd[1]: Reached target Network.
Dec 01 08:36:02 np0005540741.novalocal systemd[1]: Starting Network Manager Wait Online...
Dec 01 08:36:02 np0005540741.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Dec 01 08:36:02 np0005540741.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2583] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2604] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 01 08:36:02 np0005540741.novalocal NetworkManager[858]: <info>  [1764578162.2609] device (lo): Activation: successful, device activated.
Dec 01 08:36:02 np0005540741.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Dec 01 08:36:02 np0005540741.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 01 08:36:02 np0005540741.novalocal systemd[1]: Reached target NFS client services.
Dec 01 08:36:02 np0005540741.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Dec 01 08:36:02 np0005540741.novalocal systemd[1]: Reached target Remote File Systems.
Dec 01 08:36:02 np0005540741.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 01 08:36:03 np0005540741.novalocal NetworkManager[858]: <info>  [1764578163.0247] dhcp4 (eth0): state changed new lease, address=38.102.83.132
Dec 01 08:36:03 np0005540741.novalocal NetworkManager[858]: <info>  [1764578163.0260] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 01 08:36:03 np0005540741.novalocal NetworkManager[858]: <info>  [1764578163.0282] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 01 08:36:03 np0005540741.novalocal NetworkManager[858]: <info>  [1764578163.0324] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 01 08:36:03 np0005540741.novalocal NetworkManager[858]: <info>  [1764578163.0326] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 01 08:36:03 np0005540741.novalocal NetworkManager[858]: <info>  [1764578163.0330] manager: NetworkManager state is now CONNECTED_SITE
Dec 01 08:36:03 np0005540741.novalocal NetworkManager[858]: <info>  [1764578163.0334] device (eth0): Activation: successful, device activated.
Dec 01 08:36:03 np0005540741.novalocal NetworkManager[858]: <info>  [1764578163.0338] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 01 08:36:03 np0005540741.novalocal NetworkManager[858]: <info>  [1764578163.0340] manager: startup complete
Dec 01 08:36:03 np0005540741.novalocal systemd[1]: Finished Network Manager Wait Online.
Dec 01 08:36:03 np0005540741.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Dec 01 08:36:03 np0005540741.novalocal cloud-init[922]: Cloud-init v. 24.4-7.el9 running 'init' at Mon, 01 Dec 2025 08:36:03 +0000. Up 8.32 seconds.
Dec 01 08:36:03 np0005540741.novalocal cloud-init[922]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Dec 01 08:36:03 np0005540741.novalocal cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 01 08:36:03 np0005540741.novalocal cloud-init[922]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Dec 01 08:36:03 np0005540741.novalocal cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 01 08:36:03 np0005540741.novalocal cloud-init[922]: ci-info: |  eth0  | True |        38.102.83.132         | 255.255.255.0 | global | fa:16:3e:f2:32:3e |
Dec 01 08:36:03 np0005540741.novalocal cloud-init[922]: ci-info: |  eth0  | True | fe80::f816:3eff:fef2:323e/64 |       .       |  link  | fa:16:3e:f2:32:3e |
Dec 01 08:36:03 np0005540741.novalocal cloud-init[922]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Dec 01 08:36:03 np0005540741.novalocal cloud-init[922]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Dec 01 08:36:03 np0005540741.novalocal cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 01 08:36:03 np0005540741.novalocal cloud-init[922]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Dec 01 08:36:03 np0005540741.novalocal cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 01 08:36:03 np0005540741.novalocal cloud-init[922]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Dec 01 08:36:03 np0005540741.novalocal cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 01 08:36:03 np0005540741.novalocal cloud-init[922]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Dec 01 08:36:03 np0005540741.novalocal cloud-init[922]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Dec 01 08:36:03 np0005540741.novalocal cloud-init[922]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Dec 01 08:36:03 np0005540741.novalocal cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 01 08:36:03 np0005540741.novalocal cloud-init[922]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Dec 01 08:36:03 np0005540741.novalocal cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 01 08:36:03 np0005540741.novalocal cloud-init[922]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Dec 01 08:36:03 np0005540741.novalocal cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 01 08:36:03 np0005540741.novalocal cloud-init[922]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Dec 01 08:36:03 np0005540741.novalocal cloud-init[922]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Dec 01 08:36:03 np0005540741.novalocal cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 01 08:36:04 np0005540741.novalocal useradd[988]: new group: name=cloud-user, GID=1001
Dec 01 08:36:04 np0005540741.novalocal useradd[988]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Dec 01 08:36:04 np0005540741.novalocal useradd[988]: add 'cloud-user' to group 'adm'
Dec 01 08:36:04 np0005540741.novalocal useradd[988]: add 'cloud-user' to group 'systemd-journal'
Dec 01 08:36:04 np0005540741.novalocal useradd[988]: add 'cloud-user' to shadow group 'adm'
Dec 01 08:36:04 np0005540741.novalocal useradd[988]: add 'cloud-user' to shadow group 'systemd-journal'
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: Generating public/private rsa key pair.
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: The key fingerprint is:
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: SHA256:xQhBWXSaL4ozNKJ/ph2WpJ0r09kyGA9f1WiFzpbkruk root@np0005540741.novalocal
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: The key's randomart image is:
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: +---[RSA 3072]----+
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: |     .+=o..      |
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: |      ..o*.      |
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: |       =+=o      |
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: |        Xo.      |
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: |  . +  =S .      |
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: | .o* =....       |
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: |. .BX+.o         |
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: | .+oX=+          |
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: |  o*o+E          |
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: +----[SHA256]-----+
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: Generating public/private ecdsa key pair.
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: The key fingerprint is:
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: SHA256:XmS/4ctkAJpdn8d7Jc1bC1X6Xenqra9xpEmasUge6zE root@np0005540741.novalocal
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: The key's randomart image is:
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: +---[ECDSA 256]---+
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: |                .|
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: |               .o|
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: |        . +   .o.|
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: |       + = o oo+o|
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: |      o Soo.=o+oB|
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: |       .o.+o*+*o=|
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: |        .E +==.+.|
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: |        . o+..+. |
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: |         .  o++o |
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: +----[SHA256]-----+
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: Generating public/private ed25519 key pair.
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: The key fingerprint is:
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: SHA256:nJTsG1uou84clkzJUTgP2QTX6XOjIVcfBfng6FGH1V8 root@np0005540741.novalocal
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: The key's randomart image is:
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: +--[ED25519 256]--+
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: |      .*+. .  .*+|
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: |      =+..o . * E|
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: |      .++. . = =o|
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: |     . *oo= = o o|
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: |      + So.* o   |
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: |     o o =. .    |
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: |      * o        |
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: |     + o         |
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: |     .*.         |
Dec 01 08:36:04 np0005540741.novalocal cloud-init[922]: +----[SHA256]-----+
Dec 01 08:36:04 np0005540741.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Dec 01 08:36:04 np0005540741.novalocal systemd[1]: Reached target Cloud-config availability.
Dec 01 08:36:04 np0005540741.novalocal systemd[1]: Reached target Network is Online.
Dec 01 08:36:04 np0005540741.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Dec 01 08:36:04 np0005540741.novalocal systemd[1]: Starting Crash recovery kernel arming...
Dec 01 08:36:04 np0005540741.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Dec 01 08:36:04 np0005540741.novalocal systemd[1]: Starting System Logging Service...
Dec 01 08:36:04 np0005540741.novalocal systemd[1]: Starting OpenSSH server daemon...
Dec 01 08:36:04 np0005540741.novalocal sm-notify[1006]: Version 2.5.4 starting
Dec 01 08:36:04 np0005540741.novalocal systemd[1]: Starting Permit User Sessions...
Dec 01 08:36:04 np0005540741.novalocal systemd[1]: Started Notify NFS peers of a restart.
Dec 01 08:36:04 np0005540741.novalocal sshd[1008]: Server listening on 0.0.0.0 port 22.
Dec 01 08:36:04 np0005540741.novalocal sshd[1008]: Server listening on :: port 22.
Dec 01 08:36:04 np0005540741.novalocal systemd[1]: Finished Permit User Sessions.
Dec 01 08:36:04 np0005540741.novalocal systemd[1]: Started OpenSSH server daemon.
Dec 01 08:36:04 np0005540741.novalocal systemd[1]: Started Command Scheduler.
Dec 01 08:36:04 np0005540741.novalocal systemd[1]: Started Getty on tty1.
Dec 01 08:36:04 np0005540741.novalocal systemd[1]: Started Serial Getty on ttyS0.
Dec 01 08:36:04 np0005540741.novalocal crond[1011]: (CRON) STARTUP (1.5.7)
Dec 01 08:36:04 np0005540741.novalocal crond[1011]: (CRON) INFO (Syslog will be used instead of sendmail.)
Dec 01 08:36:04 np0005540741.novalocal crond[1011]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 62% if used.)
Dec 01 08:36:04 np0005540741.novalocal crond[1011]: (CRON) INFO (running with inotify support)
Dec 01 08:36:04 np0005540741.novalocal systemd[1]: Reached target Login Prompts.
Dec 01 08:36:04 np0005540741.novalocal rsyslogd[1007]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1007" x-info="https://www.rsyslog.com"] start
Dec 01 08:36:04 np0005540741.novalocal rsyslogd[1007]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Dec 01 08:36:04 np0005540741.novalocal systemd[1]: Started System Logging Service.
Dec 01 08:36:04 np0005540741.novalocal systemd[1]: Reached target Multi-User System.
Dec 01 08:36:04 np0005540741.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Dec 01 08:36:04 np0005540741.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Dec 01 08:36:04 np0005540741.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Dec 01 08:36:04 np0005540741.novalocal rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 01 08:36:04 np0005540741.novalocal kdumpctl[1019]: kdump: No kdump initial ramdisk found.
Dec 01 08:36:04 np0005540741.novalocal kdumpctl[1019]: kdump: Rebuilding /boot/initramfs-5.14.0-642.el9.x86_64kdump.img
Dec 01 08:36:04 np0005540741.novalocal cloud-init[1144]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Mon, 01 Dec 2025 08:36:04 +0000. Up 9.87 seconds.
Dec 01 08:36:05 np0005540741.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Dec 01 08:36:05 np0005540741.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Dec 01 08:36:05 np0005540741.novalocal dracut[1267]: dracut-057-102.git20250818.el9
Dec 01 08:36:05 np0005540741.novalocal sshd-session[1285]: Connection reset by 38.102.83.114 port 34910 [preauth]
Dec 01 08:36:05 np0005540741.novalocal sshd-session[1293]: Unable to negotiate with 38.102.83.114 port 57614: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Dec 01 08:36:05 np0005540741.novalocal dracut[1269]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-642.el9.x86_64kdump.img 5.14.0-642.el9.x86_64
Dec 01 08:36:05 np0005540741.novalocal cloud-init[1300]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Mon, 01 Dec 2025 08:36:05 +0000. Up 10.27 seconds.
Dec 01 08:36:05 np0005540741.novalocal sshd-session[1313]: Unable to negotiate with 38.102.83.114 port 57638: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Dec 01 08:36:05 np0005540741.novalocal sshd-session[1318]: Unable to negotiate with 38.102.83.114 port 57640: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Dec 01 08:36:05 np0005540741.novalocal cloud-init[1339]: #############################################################
Dec 01 08:36:05 np0005540741.novalocal cloud-init[1343]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Dec 01 08:36:05 np0005540741.novalocal sshd-session[1330]: Connection reset by 38.102.83.114 port 57648 [preauth]
Dec 01 08:36:05 np0005540741.novalocal cloud-init[1349]: 256 SHA256:XmS/4ctkAJpdn8d7Jc1bC1X6Xenqra9xpEmasUge6zE root@np0005540741.novalocal (ECDSA)
Dec 01 08:36:05 np0005540741.novalocal cloud-init[1356]: 256 SHA256:nJTsG1uou84clkzJUTgP2QTX6XOjIVcfBfng6FGH1V8 root@np0005540741.novalocal (ED25519)
Dec 01 08:36:05 np0005540741.novalocal cloud-init[1360]: 3072 SHA256:xQhBWXSaL4ozNKJ/ph2WpJ0r09kyGA9f1WiFzpbkruk root@np0005540741.novalocal (RSA)
Dec 01 08:36:05 np0005540741.novalocal sshd-session[1348]: Connection reset by 38.102.83.114 port 57664 [preauth]
Dec 01 08:36:05 np0005540741.novalocal cloud-init[1361]: -----END SSH HOST KEY FINGERPRINTS-----
Dec 01 08:36:05 np0005540741.novalocal cloud-init[1363]: #############################################################
Dec 01 08:36:05 np0005540741.novalocal sshd-session[1362]: Unable to negotiate with 38.102.83.114 port 57672: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Dec 01 08:36:05 np0005540741.novalocal sshd-session[1368]: Unable to negotiate with 38.102.83.114 port 57688: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Dec 01 08:36:05 np0005540741.novalocal sshd-session[1303]: Connection closed by 38.102.83.114 port 57622 [preauth]
Dec 01 08:36:05 np0005540741.novalocal cloud-init[1300]: Cloud-init v. 24.4-7.el9 finished at Mon, 01 Dec 2025 08:36:05 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 10.45 seconds
Dec 01 08:36:05 np0005540741.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Dec 01 08:36:05 np0005540741.novalocal systemd[1]: Reached target Cloud-init target.
Dec 01 08:36:05 np0005540741.novalocal dracut[1269]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Dec 01 08:36:05 np0005540741.novalocal dracut[1269]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Dec 01 08:36:05 np0005540741.novalocal dracut[1269]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Dec 01 08:36:05 np0005540741.novalocal dracut[1269]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 01 08:36:05 np0005540741.novalocal dracut[1269]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 01 08:36:05 np0005540741.novalocal dracut[1269]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 01 08:36:05 np0005540741.novalocal dracut[1269]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 01 08:36:05 np0005540741.novalocal dracut[1269]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 01 08:36:05 np0005540741.novalocal dracut[1269]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 01 08:36:05 np0005540741.novalocal dracut[1269]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 01 08:36:05 np0005540741.novalocal dracut[1269]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 01 08:36:05 np0005540741.novalocal dracut[1269]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 01 08:36:05 np0005540741.novalocal dracut[1269]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 01 08:36:05 np0005540741.novalocal dracut[1269]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 01 08:36:05 np0005540741.novalocal dracut[1269]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Dec 01 08:36:05 np0005540741.novalocal dracut[1269]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Dec 01 08:36:05 np0005540741.novalocal dracut[1269]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 01 08:36:05 np0005540741.novalocal dracut[1269]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 01 08:36:05 np0005540741.novalocal dracut[1269]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 01 08:36:05 np0005540741.novalocal dracut[1269]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 01 08:36:05 np0005540741.novalocal dracut[1269]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 01 08:36:05 np0005540741.novalocal dracut[1269]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 01 08:36:05 np0005540741.novalocal dracut[1269]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: Module 'resume' will not be installed, because it's in the list to be omitted!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: memstrack is not available
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: memstrack is not available
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: *** Including module: systemd ***
Dec 01 08:36:06 np0005540741.novalocal dracut[1269]: *** Including module: fips ***
Dec 01 08:36:07 np0005540741.novalocal dracut[1269]: *** Including module: systemd-initrd ***
Dec 01 08:36:07 np0005540741.novalocal dracut[1269]: *** Including module: i18n ***
Dec 01 08:36:07 np0005540741.novalocal dracut[1269]: *** Including module: drm ***
Dec 01 08:36:07 np0005540741.novalocal dracut[1269]: *** Including module: prefixdevname ***
Dec 01 08:36:07 np0005540741.novalocal dracut[1269]: *** Including module: kernel-modules ***
Dec 01 08:36:07 np0005540741.novalocal kernel: block vda: the capability attribute has been deprecated.
Dec 01 08:36:08 np0005540741.novalocal chronyd[791]: Selected source 54.39.23.64 (2.centos.pool.ntp.org)
Dec 01 08:36:08 np0005540741.novalocal chronyd[791]: System clock TAI offset set to 37 seconds
Dec 01 08:36:08 np0005540741.novalocal dracut[1269]: *** Including module: kernel-modules-extra ***
Dec 01 08:36:08 np0005540741.novalocal dracut[1269]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Dec 01 08:36:08 np0005540741.novalocal dracut[1269]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Dec 01 08:36:08 np0005540741.novalocal dracut[1269]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Dec 01 08:36:08 np0005540741.novalocal dracut[1269]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Dec 01 08:36:08 np0005540741.novalocal dracut[1269]: *** Including module: qemu ***
Dec 01 08:36:08 np0005540741.novalocal dracut[1269]: *** Including module: fstab-sys ***
Dec 01 08:36:08 np0005540741.novalocal dracut[1269]: *** Including module: rootfs-block ***
Dec 01 08:36:08 np0005540741.novalocal dracut[1269]: *** Including module: terminfo ***
Dec 01 08:36:08 np0005540741.novalocal dracut[1269]: *** Including module: udev-rules ***
Dec 01 08:36:09 np0005540741.novalocal dracut[1269]: Skipping udev rule: 91-permissions.rules
Dec 01 08:36:09 np0005540741.novalocal dracut[1269]: Skipping udev rule: 80-drivers-modprobe.rules
Dec 01 08:36:09 np0005540741.novalocal dracut[1269]: *** Including module: virtiofs ***
Dec 01 08:36:09 np0005540741.novalocal dracut[1269]: *** Including module: dracut-systemd ***
Dec 01 08:36:09 np0005540741.novalocal dracut[1269]: *** Including module: usrmount ***
Dec 01 08:36:09 np0005540741.novalocal dracut[1269]: *** Including module: base ***
Dec 01 08:36:09 np0005540741.novalocal dracut[1269]: *** Including module: fs-lib ***
Dec 01 08:36:09 np0005540741.novalocal dracut[1269]: *** Including module: kdumpbase ***
Dec 01 08:36:10 np0005540741.novalocal dracut[1269]: *** Including module: microcode_ctl-fw_dir_override ***
Dec 01 08:36:10 np0005540741.novalocal dracut[1269]:   microcode_ctl module: mangling fw_dir
Dec 01 08:36:10 np0005540741.novalocal dracut[1269]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Dec 01 08:36:10 np0005540741.novalocal dracut[1269]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Dec 01 08:36:10 np0005540741.novalocal dracut[1269]:     microcode_ctl: configuration "intel" is ignored
Dec 01 08:36:10 np0005540741.novalocal dracut[1269]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Dec 01 08:36:10 np0005540741.novalocal dracut[1269]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Dec 01 08:36:10 np0005540741.novalocal dracut[1269]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Dec 01 08:36:10 np0005540741.novalocal dracut[1269]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Dec 01 08:36:10 np0005540741.novalocal dracut[1269]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Dec 01 08:36:10 np0005540741.novalocal dracut[1269]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Dec 01 08:36:10 np0005540741.novalocal dracut[1269]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Dec 01 08:36:10 np0005540741.novalocal dracut[1269]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Dec 01 08:36:10 np0005540741.novalocal dracut[1269]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Dec 01 08:36:10 np0005540741.novalocal dracut[1269]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Dec 01 08:36:10 np0005540741.novalocal dracut[1269]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Dec 01 08:36:10 np0005540741.novalocal dracut[1269]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Dec 01 08:36:10 np0005540741.novalocal dracut[1269]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Dec 01 08:36:10 np0005540741.novalocal dracut[1269]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Dec 01 08:36:10 np0005540741.novalocal dracut[1269]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Dec 01 08:36:10 np0005540741.novalocal dracut[1269]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Dec 01 08:36:10 np0005540741.novalocal dracut[1269]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Dec 01 08:36:10 np0005540741.novalocal dracut[1269]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Dec 01 08:36:10 np0005540741.novalocal dracut[1269]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Dec 01 08:36:10 np0005540741.novalocal dracut[1269]: *** Including module: openssl ***
Dec 01 08:36:10 np0005540741.novalocal dracut[1269]: *** Including module: shutdown ***
Dec 01 08:36:10 np0005540741.novalocal dracut[1269]: *** Including module: squash ***
Dec 01 08:36:10 np0005540741.novalocal dracut[1269]: *** Including modules done ***
Dec 01 08:36:10 np0005540741.novalocal dracut[1269]: *** Installing kernel module dependencies ***
Dec 01 08:36:11 np0005540741.novalocal irqbalance[783]: Cannot change IRQ 25 affinity: Operation not permitted
Dec 01 08:36:11 np0005540741.novalocal irqbalance[783]: IRQ 25 affinity is now unmanaged
Dec 01 08:36:11 np0005540741.novalocal irqbalance[783]: Cannot change IRQ 31 affinity: Operation not permitted
Dec 01 08:36:11 np0005540741.novalocal irqbalance[783]: IRQ 31 affinity is now unmanaged
Dec 01 08:36:11 np0005540741.novalocal irqbalance[783]: Cannot change IRQ 28 affinity: Operation not permitted
Dec 01 08:36:11 np0005540741.novalocal irqbalance[783]: IRQ 28 affinity is now unmanaged
Dec 01 08:36:11 np0005540741.novalocal irqbalance[783]: Cannot change IRQ 32 affinity: Operation not permitted
Dec 01 08:36:11 np0005540741.novalocal irqbalance[783]: IRQ 32 affinity is now unmanaged
Dec 01 08:36:11 np0005540741.novalocal irqbalance[783]: Cannot change IRQ 30 affinity: Operation not permitted
Dec 01 08:36:11 np0005540741.novalocal irqbalance[783]: IRQ 30 affinity is now unmanaged
Dec 01 08:36:11 np0005540741.novalocal irqbalance[783]: Cannot change IRQ 29 affinity: Operation not permitted
Dec 01 08:36:11 np0005540741.novalocal irqbalance[783]: IRQ 29 affinity is now unmanaged
Dec 01 08:36:11 np0005540741.novalocal dracut[1269]: *** Installing kernel module dependencies done ***
Dec 01 08:36:11 np0005540741.novalocal dracut[1269]: *** Resolving executable dependencies ***
Dec 01 08:36:13 np0005540741.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 01 08:36:13 np0005540741.novalocal dracut[1269]: *** Resolving executable dependencies done ***
Dec 01 08:36:13 np0005540741.novalocal dracut[1269]: *** Generating early-microcode cpio image ***
Dec 01 08:36:13 np0005540741.novalocal dracut[1269]: *** Store current command line parameters ***
Dec 01 08:36:13 np0005540741.novalocal dracut[1269]: Stored kernel commandline:
Dec 01 08:36:13 np0005540741.novalocal dracut[1269]: No dracut internal kernel commandline stored in the initramfs
Dec 01 08:36:13 np0005540741.novalocal dracut[1269]: *** Install squash loader ***
Dec 01 08:36:14 np0005540741.novalocal dracut[1269]: *** Squashing the files inside the initramfs ***
Dec 01 08:36:15 np0005540741.novalocal dracut[1269]: *** Squashing the files inside the initramfs done ***
Dec 01 08:36:15 np0005540741.novalocal dracut[1269]: *** Creating image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' ***
Dec 01 08:36:15 np0005540741.novalocal dracut[1269]: *** Hardlinking files ***
Dec 01 08:36:15 np0005540741.novalocal dracut[1269]: Mode:           real
Dec 01 08:36:15 np0005540741.novalocal dracut[1269]: Files:          50
Dec 01 08:36:15 np0005540741.novalocal dracut[1269]: Linked:         0 files
Dec 01 08:36:15 np0005540741.novalocal dracut[1269]: Compared:       0 xattrs
Dec 01 08:36:15 np0005540741.novalocal dracut[1269]: Compared:       0 files
Dec 01 08:36:15 np0005540741.novalocal dracut[1269]: Saved:          0 B
Dec 01 08:36:15 np0005540741.novalocal dracut[1269]: Duration:       0.000780 seconds
Dec 01 08:36:15 np0005540741.novalocal dracut[1269]: *** Hardlinking files done ***
Dec 01 08:36:15 np0005540741.novalocal dracut[1269]: *** Creating initramfs image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' done ***
Dec 01 08:36:16 np0005540741.novalocal kdumpctl[1019]: kdump: kexec: loaded kdump kernel
Dec 01 08:36:16 np0005540741.novalocal kdumpctl[1019]: kdump: Starting kdump: [OK]
Dec 01 08:36:16 np0005540741.novalocal systemd[1]: Finished Crash recovery kernel arming.
Dec 01 08:36:16 np0005540741.novalocal systemd[1]: Startup finished in 1.857s (kernel) + 2.605s (initrd) + 17.040s (userspace) = 21.503s.
Dec 01 08:36:32 np0005540741.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 01 08:36:33 np0005540741.novalocal sshd-session[4298]: Accepted publickey for zuul from 38.102.83.114 port 49226 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Dec 01 08:36:34 np0005540741.novalocal systemd[1]: Created slice User Slice of UID 1000.
Dec 01 08:36:34 np0005540741.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Dec 01 08:36:34 np0005540741.novalocal systemd-logind[788]: New session 1 of user zuul.
Dec 01 08:36:34 np0005540741.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Dec 01 08:36:34 np0005540741.novalocal systemd[1]: Starting User Manager for UID 1000...
Dec 01 08:36:34 np0005540741.novalocal systemd[4302]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 08:36:34 np0005540741.novalocal systemd[4302]: Queued start job for default target Main User Target.
Dec 01 08:36:34 np0005540741.novalocal systemd[4302]: Created slice User Application Slice.
Dec 01 08:36:34 np0005540741.novalocal systemd[4302]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 01 08:36:34 np0005540741.novalocal systemd[4302]: Started Daily Cleanup of User's Temporary Directories.
Dec 01 08:36:34 np0005540741.novalocal systemd[4302]: Reached target Paths.
Dec 01 08:36:34 np0005540741.novalocal systemd[4302]: Reached target Timers.
Dec 01 08:36:34 np0005540741.novalocal systemd[4302]: Starting D-Bus User Message Bus Socket...
Dec 01 08:36:34 np0005540741.novalocal systemd[4302]: Starting Create User's Volatile Files and Directories...
Dec 01 08:36:34 np0005540741.novalocal systemd[4302]: Finished Create User's Volatile Files and Directories.
Dec 01 08:36:34 np0005540741.novalocal systemd[4302]: Listening on D-Bus User Message Bus Socket.
Dec 01 08:36:34 np0005540741.novalocal systemd[4302]: Reached target Sockets.
Dec 01 08:36:34 np0005540741.novalocal systemd[4302]: Reached target Basic System.
Dec 01 08:36:34 np0005540741.novalocal systemd[4302]: Reached target Main User Target.
Dec 01 08:36:34 np0005540741.novalocal systemd[4302]: Startup finished in 132ms.
Dec 01 08:36:34 np0005540741.novalocal systemd[1]: Started User Manager for UID 1000.
Dec 01 08:36:34 np0005540741.novalocal systemd[1]: Started Session 1 of User zuul.
Dec 01 08:36:34 np0005540741.novalocal sshd-session[4298]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 08:36:34 np0005540741.novalocal python3[4384]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 08:36:37 np0005540741.novalocal python3[4412]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 08:36:42 np0005540741.novalocal python3[4470]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 08:36:43 np0005540741.novalocal python3[4510]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Dec 01 08:36:45 np0005540741.novalocal python3[4536]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDWBn8adtzj5VQSMTGCbPZbb2t89ydBfmt2xnyrzcW5lk8iyDmU7stlRl54OXCT92s+i4aEedo3N8/84mj25AaJuCIy6nrmTOHyLnfcjuosGYCnwHzCe19VyrE/wDLd61C0JtSfnBmMiz2v89KKX1dwLfQ8rY6+SsNUONSbNacinrODyDCJAZX+8BD0WWCHAFXp1sJrMs03LwF6slZnK38R/nNniLlW5wrtwmsinG8g3TYTMxhnoleJgzOOOdLLN17z+IyHtpK/U82kBeP3113pUfJt+oNS/yFZJvzATFsc5sbQwPqscJ/tuge5khq+PAMcFQnfPLwl8sWM+bmMT/nybM1cGGvMR9sodRHwRFNoluvDjYHvT/sTGItVocsh+4rwmhxxVv7eWZhxPcChUvuOA1/hlWYHlie6GhVjWn2363noxAZXR4xarW++iECASNQbL03ddJYNKsQoaUGwrCG7uJcrMVgNYCsQScqEq7uh0Kw75SGqU6KbKJgfZ/N5Gss= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 08:36:45 np0005540741.novalocal python3[4560]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 08:36:46 np0005540741.novalocal python3[4659]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 08:36:46 np0005540741.novalocal python3[4730]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764578206.070604-207-198921412613347/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=b2ce5c11e0624cf7b75dcf49498569ff_id_rsa follow=False checksum=feda567e9354865c74d371505b3546f00914f204 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 08:36:47 np0005540741.novalocal python3[4853]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 08:36:47 np0005540741.novalocal python3[4924]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764578207.1587334-240-59453237360859/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=b2ce5c11e0624cf7b75dcf49498569ff_id_rsa.pub follow=False checksum=9e3220d039ae9e5c26ede6336dc219a70d0b7eba backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 08:36:49 np0005540741.novalocal python3[4972]: ansible-ping Invoked with data=pong
Dec 01 08:36:50 np0005540741.novalocal python3[4996]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 08:36:51 np0005540741.novalocal python3[5054]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Dec 01 08:36:52 np0005540741.novalocal python3[5086]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 08:36:52 np0005540741.novalocal python3[5110]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 08:36:53 np0005540741.novalocal python3[5134]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 08:36:53 np0005540741.novalocal python3[5158]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 08:36:53 np0005540741.novalocal python3[5182]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 08:36:54 np0005540741.novalocal python3[5206]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 08:36:55 np0005540741.novalocal sudo[5230]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olbtfylwdtloqrhbezswoqxamfebqhyq ; /usr/bin/python3'
Dec 01 08:36:55 np0005540741.novalocal sudo[5230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:36:55 np0005540741.novalocal python3[5232]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 08:36:55 np0005540741.novalocal sudo[5230]: pam_unix(sudo:session): session closed for user root
Dec 01 08:36:55 np0005540741.novalocal sudo[5308]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpezdagacycagfxznlaskxmnzenthipz ; /usr/bin/python3'
Dec 01 08:36:55 np0005540741.novalocal sudo[5308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:36:56 np0005540741.novalocal python3[5310]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 08:36:56 np0005540741.novalocal sudo[5308]: pam_unix(sudo:session): session closed for user root
Dec 01 08:36:56 np0005540741.novalocal sudo[5381]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukuughstpzzrniizdwrcepkwupogqdhw ; /usr/bin/python3'
Dec 01 08:36:56 np0005540741.novalocal sudo[5381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:36:56 np0005540741.novalocal python3[5383]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764578215.640814-21-243403188566425/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 08:36:56 np0005540741.novalocal sudo[5381]: pam_unix(sudo:session): session closed for user root
Dec 01 08:36:57 np0005540741.novalocal python3[5431]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 08:36:57 np0005540741.novalocal python3[5455]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 08:36:57 np0005540741.novalocal python3[5479]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 08:36:57 np0005540741.novalocal python3[5503]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 08:36:58 np0005540741.novalocal python3[5527]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 08:36:58 np0005540741.novalocal python3[5551]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 08:36:58 np0005540741.novalocal python3[5575]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 08:36:59 np0005540741.novalocal python3[5599]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 08:36:59 np0005540741.novalocal python3[5623]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 08:36:59 np0005540741.novalocal python3[5647]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 08:36:59 np0005540741.novalocal python3[5671]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 08:37:00 np0005540741.novalocal python3[5695]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 08:37:00 np0005540741.novalocal python3[5719]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 08:37:00 np0005540741.novalocal python3[5743]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 08:37:00 np0005540741.novalocal python3[5767]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 08:37:01 np0005540741.novalocal python3[5791]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 08:37:01 np0005540741.novalocal python3[5815]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 08:37:01 np0005540741.novalocal python3[5839]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 08:37:02 np0005540741.novalocal python3[5863]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 08:37:02 np0005540741.novalocal python3[5887]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 08:37:02 np0005540741.novalocal python3[5911]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 08:37:02 np0005540741.novalocal python3[5935]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 08:37:03 np0005540741.novalocal python3[5959]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 08:37:03 np0005540741.novalocal python3[5983]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 08:37:03 np0005540741.novalocal python3[6007]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 08:37:03 np0005540741.novalocal python3[6031]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 08:37:06 np0005540741.novalocal sudo[6055]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yokthrwslcbqukjizttqyffnhsvcboaw ; /usr/bin/python3'
Dec 01 08:37:06 np0005540741.novalocal sudo[6055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:37:06 np0005540741.novalocal python3[6057]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 01 08:37:06 np0005540741.novalocal systemd[1]: Starting Time & Date Service...
Dec 01 08:37:06 np0005540741.novalocal systemd[1]: Started Time & Date Service.
Dec 01 08:37:06 np0005540741.novalocal systemd-timedated[6059]: Changed time zone to 'UTC' (UTC).
Dec 01 08:37:06 np0005540741.novalocal sudo[6055]: pam_unix(sudo:session): session closed for user root
Dec 01 08:37:07 np0005540741.novalocal sudo[6086]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-newlkaqwdswsxmafousuekohmlkjxizh ; /usr/bin/python3'
Dec 01 08:37:07 np0005540741.novalocal sudo[6086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:37:07 np0005540741.novalocal python3[6088]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 08:37:07 np0005540741.novalocal sudo[6086]: pam_unix(sudo:session): session closed for user root
Dec 01 08:37:07 np0005540741.novalocal python3[6164]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 08:37:07 np0005540741.novalocal python3[6235]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1764578227.380935-153-225488970151633/source _original_basename=tmp_qz7o2l_ follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 08:37:08 np0005540741.novalocal python3[6335]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 08:37:08 np0005540741.novalocal python3[6406]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764578228.1802669-183-160260386879000/source _original_basename=tmphofnz4o9 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 08:37:09 np0005540741.novalocal sudo[6506]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkpbjnvgkfunweqmpmbhexqqphwovbqx ; /usr/bin/python3'
Dec 01 08:37:09 np0005540741.novalocal sudo[6506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:37:09 np0005540741.novalocal python3[6508]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 08:37:09 np0005540741.novalocal sudo[6506]: pam_unix(sudo:session): session closed for user root
Dec 01 08:37:09 np0005540741.novalocal sudo[6579]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdumbwuyiehpptwdwotqkgqjoyxovgyf ; /usr/bin/python3'
Dec 01 08:37:09 np0005540741.novalocal sudo[6579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:37:09 np0005540741.novalocal python3[6581]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764578229.2338011-231-239681911058611/source _original_basename=tmpd97heb_4 follow=False checksum=2e193f101b911db5e638a5fc33120ba1c99c8f88 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 08:37:09 np0005540741.novalocal sudo[6579]: pam_unix(sudo:session): session closed for user root
Dec 01 08:37:10 np0005540741.novalocal python3[6629]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 08:37:10 np0005540741.novalocal python3[6655]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 08:37:10 np0005540741.novalocal sudo[6733]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zggkekssaxajukqvfhcntmidrctleopa ; /usr/bin/python3'
Dec 01 08:37:10 np0005540741.novalocal sudo[6733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:37:11 np0005540741.novalocal python3[6735]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 08:37:11 np0005540741.novalocal sudo[6733]: pam_unix(sudo:session): session closed for user root
Dec 01 08:37:11 np0005540741.novalocal sudo[6806]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsizgfhssbrjqynmbozdhtrlorhoetor ; /usr/bin/python3'
Dec 01 08:37:11 np0005540741.novalocal sudo[6806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:37:11 np0005540741.novalocal python3[6808]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1764578230.8271308-273-28826554359574/source _original_basename=tmpj4g8imht follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 08:37:11 np0005540741.novalocal sudo[6806]: pam_unix(sudo:session): session closed for user root
Dec 01 08:37:11 np0005540741.novalocal sudo[6857]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywisfubfikuahoalduupduehhdsveibt ; /usr/bin/python3'
Dec 01 08:37:11 np0005540741.novalocal sudo[6857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:37:12 np0005540741.novalocal python3[6859]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ec2-ffbe-d947-2e6a-00000000001d-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 08:37:12 np0005540741.novalocal sudo[6857]: pam_unix(sudo:session): session closed for user root
Dec 01 08:37:12 np0005540741.novalocal python3[6887]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env _uses_shell=True zuul_log_id=fa163ec2-ffbe-d947-2e6a-00000000001e-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Dec 01 08:37:13 np0005540741.novalocal python3[6916]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 08:37:33 np0005540741.novalocal sudo[6940]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yisnaenhfonpnamagwrqvrguhidwntin ; /usr/bin/python3'
Dec 01 08:37:33 np0005540741.novalocal sudo[6940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:37:33 np0005540741.novalocal python3[6942]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 08:37:33 np0005540741.novalocal sudo[6940]: pam_unix(sudo:session): session closed for user root
Dec 01 08:37:36 np0005540741.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 01 08:38:18 np0005540741.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec 01 08:38:18 np0005540741.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Dec 01 08:38:18 np0005540741.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Dec 01 08:38:18 np0005540741.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Dec 01 08:38:18 np0005540741.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Dec 01 08:38:18 np0005540741.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Dec 01 08:38:18 np0005540741.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Dec 01 08:38:18 np0005540741.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Dec 01 08:38:18 np0005540741.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Dec 01 08:38:18 np0005540741.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Dec 01 08:38:18 np0005540741.novalocal NetworkManager[858]: <info>  [1764578298.9251] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 01 08:38:18 np0005540741.novalocal systemd-udevd[6946]: Network interface NamePolicy= disabled on kernel command line.
Dec 01 08:38:18 np0005540741.novalocal NetworkManager[858]: <info>  [1764578298.9464] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 01 08:38:18 np0005540741.novalocal NetworkManager[858]: <info>  [1764578298.9512] settings: (eth1): created default wired connection 'Wired connection 1'
Dec 01 08:38:18 np0005540741.novalocal NetworkManager[858]: <info>  [1764578298.9522] device (eth1): carrier: link connected
Dec 01 08:38:18 np0005540741.novalocal NetworkManager[858]: <info>  [1764578298.9526] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec 01 08:38:18 np0005540741.novalocal NetworkManager[858]: <info>  [1764578298.9539] policy: auto-activating connection 'Wired connection 1' (c6b7d4f6-4237-35c7-90cb-622f3da1d185)
Dec 01 08:38:18 np0005540741.novalocal NetworkManager[858]: <info>  [1764578298.9548] device (eth1): Activation: starting connection 'Wired connection 1' (c6b7d4f6-4237-35c7-90cb-622f3da1d185)
Dec 01 08:38:18 np0005540741.novalocal NetworkManager[858]: <info>  [1764578298.9550] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 01 08:38:18 np0005540741.novalocal NetworkManager[858]: <info>  [1764578298.9558] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 01 08:38:18 np0005540741.novalocal NetworkManager[858]: <info>  [1764578298.9567] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 01 08:38:18 np0005540741.novalocal NetworkManager[858]: <info>  [1764578298.9577] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 01 08:38:20 np0005540741.novalocal python3[6972]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ec2-ffbe-bd7e-bab5-0000000000fc-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 08:38:29 np0005540741.novalocal sudo[7050]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyuiyvweydvgvykjhbyxhpemuzdeabzl ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 01 08:38:29 np0005540741.novalocal sudo[7050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:38:30 np0005540741.novalocal python3[7052]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 08:38:30 np0005540741.novalocal sudo[7050]: pam_unix(sudo:session): session closed for user root
Dec 01 08:38:30 np0005540741.novalocal sudo[7123]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mabjyysiqjdhhviywxzbstypxekpdqdw ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 01 08:38:30 np0005540741.novalocal sudo[7123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:38:30 np0005540741.novalocal python3[7125]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764578309.7922854-102-9430775054342/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=351d510ee20d95814e9a8640058fdb7b2e7669b8 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 08:38:30 np0005540741.novalocal sudo[7123]: pam_unix(sudo:session): session closed for user root
Dec 01 08:38:31 np0005540741.novalocal sudo[7173]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpfvbwbfpdvaltzxklisjpqshtmegdub ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 01 08:38:31 np0005540741.novalocal sudo[7173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:38:31 np0005540741.novalocal python3[7175]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 08:38:31 np0005540741.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec 01 08:38:31 np0005540741.novalocal systemd[1]: Stopped Network Manager Wait Online.
Dec 01 08:38:31 np0005540741.novalocal systemd[1]: Stopping Network Manager Wait Online...
Dec 01 08:38:31 np0005540741.novalocal systemd[1]: Stopping Network Manager...
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[858]: <info>  [1764578311.3104] caught SIGTERM, shutting down normally.
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[858]: <info>  [1764578311.3116] dhcp4 (eth0): canceled DHCP transaction
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[858]: <info>  [1764578311.3117] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[858]: <info>  [1764578311.3117] dhcp4 (eth0): state changed no lease
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[858]: <info>  [1764578311.3119] manager: NetworkManager state is now CONNECTING
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[858]: <info>  [1764578311.3233] dhcp4 (eth1): canceled DHCP transaction
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[858]: <info>  [1764578311.3233] dhcp4 (eth1): state changed no lease
Dec 01 08:38:31 np0005540741.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[858]: <info>  [1764578311.3268] exiting (success)
Dec 01 08:38:31 np0005540741.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 01 08:38:31 np0005540741.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Dec 01 08:38:31 np0005540741.novalocal systemd[1]: Stopped Network Manager.
Dec 01 08:38:31 np0005540741.novalocal systemd[1]: NetworkManager.service: Consumed 1.022s CPU time, 10.1M memory peak.
Dec 01 08:38:31 np0005540741.novalocal systemd[1]: Starting Network Manager...
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.3769] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:fbf967f0-219c-4ceb-b589-3e4f3756d2b4)
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.3772] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.3828] manager[0x5620120dd070]: monitoring kernel firmware directory '/lib/firmware'.
Dec 01 08:38:31 np0005540741.novalocal systemd[1]: Starting Hostname Service...
Dec 01 08:38:31 np0005540741.novalocal systemd[1]: Started Hostname Service.
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.4893] hostname: hostname: using hostnamed
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.4897] hostname: static hostname changed from (none) to "np0005540741.novalocal"
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.4905] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.4910] manager[0x5620120dd070]: rfkill: Wi-Fi hardware radio set enabled
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.4911] manager[0x5620120dd070]: rfkill: WWAN hardware radio set enabled
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.4955] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.4955] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.4956] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.4957] manager: Networking is enabled by state file
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.4961] settings: Loaded settings plugin: keyfile (internal)
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.4968] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5011] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5026] dhcp: init: Using DHCP client 'internal'
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5031] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5038] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5046] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5059] device (lo): Activation: starting connection 'lo' (d32b959f-25b9-49e2-b1c9-8c743b9b7f56)
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5069] device (eth0): carrier: link connected
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5075] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5083] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5084] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5095] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5106] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5116] device (eth1): carrier: link connected
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5122] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5129] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (c6b7d4f6-4237-35c7-90cb-622f3da1d185) (indicated)
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5129] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5137] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5151] device (eth1): Activation: starting connection 'Wired connection 1' (c6b7d4f6-4237-35c7-90cb-622f3da1d185)
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5160] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 01 08:38:31 np0005540741.novalocal systemd[1]: Started Network Manager.
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5168] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5173] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5176] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5180] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5187] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5190] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5194] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5198] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5207] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5212] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5226] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5229] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5252] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5261] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5269] device (lo): Activation: successful, device activated.
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5279] dhcp4 (eth0): state changed new lease, address=38.102.83.132
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5290] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 01 08:38:31 np0005540741.novalocal systemd[1]: Starting Network Manager Wait Online...
Dec 01 08:38:31 np0005540741.novalocal sudo[7173]: pam_unix(sudo:session): session closed for user root
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5432] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5469] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5472] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5475] manager: NetworkManager state is now CONNECTED_SITE
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5478] device (eth0): Activation: successful, device activated.
Dec 01 08:38:31 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578311.5483] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 01 08:38:31 np0005540741.novalocal python3[7259]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ec2-ffbe-bd7e-bab5-0000000000a7-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 08:38:41 np0005540741.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 01 08:39:01 np0005540741.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 01 08:39:06 np0005540741.novalocal systemd[4302]: Starting Mark boot as successful...
Dec 01 08:39:06 np0005540741.novalocal systemd[4302]: Finished Mark boot as successful.
Dec 01 08:39:17 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578357.0314] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 01 08:39:17 np0005540741.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 01 08:39:17 np0005540741.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 01 08:39:17 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578357.0654] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 01 08:39:17 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578357.0657] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 01 08:39:17 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578357.0666] device (eth1): Activation: successful, device activated.
Dec 01 08:39:17 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578357.0671] manager: startup complete
Dec 01 08:39:17 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578357.0676] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Dec 01 08:39:17 np0005540741.novalocal NetworkManager[7186]: <warn>  [1764578357.0680] device (eth1): Activation: failed for connection 'Wired connection 1'
Dec 01 08:39:17 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578357.0687] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Dec 01 08:39:17 np0005540741.novalocal systemd[1]: Finished Network Manager Wait Online.
Dec 01 08:39:17 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578357.0773] dhcp4 (eth1): canceled DHCP transaction
Dec 01 08:39:17 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578357.0774] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 01 08:39:17 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578357.0774] dhcp4 (eth1): state changed no lease
Dec 01 08:39:17 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578357.0786] policy: auto-activating connection 'ci-private-network' (178f6c72-5a8d-5d35-b5e8-6b22bb1c98c8)
Dec 01 08:39:17 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578357.0790] device (eth1): Activation: starting connection 'ci-private-network' (178f6c72-5a8d-5d35-b5e8-6b22bb1c98c8)
Dec 01 08:39:17 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578357.0790] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 01 08:39:17 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578357.0792] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 01 08:39:17 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578357.0796] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 01 08:39:17 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578357.0802] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 01 08:39:17 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578357.0830] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 01 08:39:17 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578357.0831] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 01 08:39:17 np0005540741.novalocal NetworkManager[7186]: <info>  [1764578357.0834] device (eth1): Activation: successful, device activated.
Dec 01 08:39:27 np0005540741.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 01 08:39:31 np0005540741.novalocal sshd-session[4311]: Received disconnect from 38.102.83.114 port 49226:11: disconnected by user
Dec 01 08:39:31 np0005540741.novalocal sshd-session[4311]: Disconnected from user zuul 38.102.83.114 port 49226
Dec 01 08:39:31 np0005540741.novalocal sshd-session[4298]: pam_unix(sshd:session): session closed for user zuul
Dec 01 08:39:31 np0005540741.novalocal systemd-logind[788]: Session 1 logged out. Waiting for processes to exit.
Dec 01 08:39:33 np0005540741.novalocal sshd-session[7288]: Accepted publickey for zuul from 38.102.83.114 port 46422 ssh2: RSA SHA256:wvbSSRHhGqscdLbt7uF108h9jRKS/yXgXkyPU84jbuE
Dec 01 08:39:33 np0005540741.novalocal systemd-logind[788]: New session 3 of user zuul.
Dec 01 08:39:33 np0005540741.novalocal systemd[1]: Started Session 3 of User zuul.
Dec 01 08:39:33 np0005540741.novalocal sshd-session[7288]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 08:39:34 np0005540741.novalocal sudo[7367]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqgaiyuoefcsusypfjifuamvaajfnvtm ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 01 08:39:34 np0005540741.novalocal sudo[7367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:39:34 np0005540741.novalocal python3[7369]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 08:39:34 np0005540741.novalocal sudo[7367]: pam_unix(sudo:session): session closed for user root
Dec 01 08:39:34 np0005540741.novalocal sudo[7440]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eysuxvkqtdmspmtzvifujvspmufobvrk ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 01 08:39:34 np0005540741.novalocal sudo[7440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:39:34 np0005540741.novalocal python3[7442]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764578373.9976506-267-187034409188700/source _original_basename=tmpcmm2z9eo follow=False checksum=72146e9f9cee0111e1af10d9a8bd93298758ed4f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 08:39:34 np0005540741.novalocal sudo[7440]: pam_unix(sudo:session): session closed for user root
Dec 01 08:39:36 np0005540741.novalocal sshd-session[7291]: Connection closed by 38.102.83.114 port 46422
Dec 01 08:39:36 np0005540741.novalocal sshd-session[7288]: pam_unix(sshd:session): session closed for user zuul
Dec 01 08:39:36 np0005540741.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Dec 01 08:39:36 np0005540741.novalocal systemd-logind[788]: Session 3 logged out. Waiting for processes to exit.
Dec 01 08:39:36 np0005540741.novalocal systemd-logind[788]: Removed session 3.
Dec 01 08:42:06 np0005540741.novalocal systemd[4302]: Created slice User Background Tasks Slice.
Dec 01 08:42:06 np0005540741.novalocal systemd[4302]: Starting Cleanup of User's Temporary Files and Directories...
Dec 01 08:42:06 np0005540741.novalocal systemd[4302]: Finished Cleanup of User's Temporary Files and Directories.
Dec 01 08:46:08 np0005540741.novalocal sshd-session[7472]: Accepted publickey for zuul from 38.102.83.114 port 50286 ssh2: RSA SHA256:wvbSSRHhGqscdLbt7uF108h9jRKS/yXgXkyPU84jbuE
Dec 01 08:46:08 np0005540741.novalocal systemd-logind[788]: New session 4 of user zuul.
Dec 01 08:46:08 np0005540741.novalocal systemd[1]: Started Session 4 of User zuul.
Dec 01 08:46:08 np0005540741.novalocal sshd-session[7472]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 08:46:08 np0005540741.novalocal sudo[7499]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztqlqdtnwtsyujxlntqogweowtdqiojt ; /usr/bin/python3'
Dec 01 08:46:08 np0005540741.novalocal sudo[7499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:46:08 np0005540741.novalocal python3[7501]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163ec2-ffbe-360c-38fc-000000001cd4-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 08:46:08 np0005540741.novalocal sudo[7499]: pam_unix(sudo:session): session closed for user root
Dec 01 08:46:08 np0005540741.novalocal sudo[7527]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omegaxvzxgrpnrkdglwrniwvlzocgdqk ; /usr/bin/python3'
Dec 01 08:46:08 np0005540741.novalocal sudo[7527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:46:08 np0005540741.novalocal python3[7529]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 08:46:08 np0005540741.novalocal sudo[7527]: pam_unix(sudo:session): session closed for user root
Dec 01 08:46:09 np0005540741.novalocal sudo[7554]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpjabgnjentaqiwymabvhwserfsepsdu ; /usr/bin/python3'
Dec 01 08:46:09 np0005540741.novalocal sudo[7554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:46:09 np0005540741.novalocal python3[7556]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 08:46:09 np0005540741.novalocal sudo[7554]: pam_unix(sudo:session): session closed for user root
Dec 01 08:46:09 np0005540741.novalocal sudo[7580]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcqdueegzhppmuorwdehrelvrteeiism ; /usr/bin/python3'
Dec 01 08:46:09 np0005540741.novalocal sudo[7580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:46:09 np0005540741.novalocal python3[7582]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 08:46:09 np0005540741.novalocal sudo[7580]: pam_unix(sudo:session): session closed for user root
Dec 01 08:46:09 np0005540741.novalocal sudo[7606]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgusovjtqljbapeefeqghrwimfgafzld ; /usr/bin/python3'
Dec 01 08:46:09 np0005540741.novalocal sudo[7606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:46:09 np0005540741.novalocal python3[7608]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 08:46:09 np0005540741.novalocal sudo[7606]: pam_unix(sudo:session): session closed for user root
Dec 01 08:46:10 np0005540741.novalocal sudo[7632]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrayujyeomhwbmtkjaodpsysnhgemjxw ; /usr/bin/python3'
Dec 01 08:46:10 np0005540741.novalocal sudo[7632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:46:10 np0005540741.novalocal python3[7634]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 08:46:10 np0005540741.novalocal sudo[7632]: pam_unix(sudo:session): session closed for user root
Dec 01 08:46:10 np0005540741.novalocal sudo[7710]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-poyrhcdioegqxqvhaewxyrsdkaubiwlh ; /usr/bin/python3'
Dec 01 08:46:10 np0005540741.novalocal sudo[7710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:46:10 np0005540741.novalocal python3[7712]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 08:46:10 np0005540741.novalocal sudo[7710]: pam_unix(sudo:session): session closed for user root
Dec 01 08:46:11 np0005540741.novalocal sudo[7783]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgvljwxdctodqlwoghtocqutoteymtlg ; /usr/bin/python3'
Dec 01 08:46:11 np0005540741.novalocal sudo[7783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:46:11 np0005540741.novalocal python3[7785]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764578770.6514866-479-277063520978790/source _original_basename=tmpwk3dlxpv follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 08:46:11 np0005540741.novalocal sudo[7783]: pam_unix(sudo:session): session closed for user root
Dec 01 08:46:11 np0005540741.novalocal sudo[7833]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxfltnrjtkomaxumurdokdozrirlsare ; /usr/bin/python3'
Dec 01 08:46:11 np0005540741.novalocal sudo[7833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:46:12 np0005540741.novalocal python3[7835]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 01 08:46:12 np0005540741.novalocal systemd[1]: Reloading.
Dec 01 08:46:12 np0005540741.novalocal systemd-rc-local-generator[7856]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 08:46:12 np0005540741.novalocal sudo[7833]: pam_unix(sudo:session): session closed for user root
Dec 01 08:46:13 np0005540741.novalocal sudo[7889]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yyzjcqduuwbsbudupsxjyuttpvrkcdiu ; /usr/bin/python3'
Dec 01 08:46:13 np0005540741.novalocal sudo[7889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:46:13 np0005540741.novalocal python3[7891]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Dec 01 08:46:13 np0005540741.novalocal sudo[7889]: pam_unix(sudo:session): session closed for user root
Dec 01 08:46:14 np0005540741.novalocal sudo[7915]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrahjgaxesfoytaxwhyxqvhgqbpnwmzk ; /usr/bin/python3'
Dec 01 08:46:14 np0005540741.novalocal sudo[7915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:46:14 np0005540741.novalocal python3[7917]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 08:46:14 np0005540741.novalocal sudo[7915]: pam_unix(sudo:session): session closed for user root
Dec 01 08:46:14 np0005540741.novalocal sudo[7943]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpwzlmdyzzlibrlpyxzquosjqajdkhbv ; /usr/bin/python3'
Dec 01 08:46:14 np0005540741.novalocal sudo[7943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:46:14 np0005540741.novalocal python3[7945]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 08:46:14 np0005540741.novalocal sudo[7943]: pam_unix(sudo:session): session closed for user root
Dec 01 08:46:14 np0005540741.novalocal sudo[7971]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brcgzuudrmtyifnurmeluglymqzalucm ; /usr/bin/python3'
Dec 01 08:46:14 np0005540741.novalocal sudo[7971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:46:14 np0005540741.novalocal python3[7973]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 08:46:14 np0005540741.novalocal sudo[7971]: pam_unix(sudo:session): session closed for user root
Dec 01 08:46:15 np0005540741.novalocal sudo[7999]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opmpfikwnyjhtsuedwzaxpgxeuluibtk ; /usr/bin/python3'
Dec 01 08:46:15 np0005540741.novalocal sudo[7999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:46:15 np0005540741.novalocal python3[8001]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 08:46:15 np0005540741.novalocal sudo[7999]: pam_unix(sudo:session): session closed for user root
Dec 01 08:46:15 np0005540741.novalocal python3[8028]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163ec2-ffbe-360c-38fc-000000001cdb-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 08:46:16 np0005540741.novalocal python3[8058]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 01 08:46:18 np0005540741.novalocal sshd-session[7475]: Connection closed by 38.102.83.114 port 50286
Dec 01 08:46:18 np0005540741.novalocal sshd-session[7472]: pam_unix(sshd:session): session closed for user zuul
Dec 01 08:46:18 np0005540741.novalocal systemd-logind[788]: Session 4 logged out. Waiting for processes to exit.
Dec 01 08:46:18 np0005540741.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Dec 01 08:46:18 np0005540741.novalocal systemd[1]: session-4.scope: Consumed 4.462s CPU time.
Dec 01 08:46:18 np0005540741.novalocal systemd-logind[788]: Removed session 4.
Dec 01 08:46:19 np0005540741.novalocal sshd-session[8063]: Accepted publickey for zuul from 38.102.83.114 port 52416 ssh2: RSA SHA256:wvbSSRHhGqscdLbt7uF108h9jRKS/yXgXkyPU84jbuE
Dec 01 08:46:19 np0005540741.novalocal systemd-logind[788]: New session 5 of user zuul.
Dec 01 08:46:19 np0005540741.novalocal systemd[1]: Started Session 5 of User zuul.
Dec 01 08:46:19 np0005540741.novalocal sshd-session[8063]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 08:46:19 np0005540741.novalocal sudo[8090]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrnkrruaddkwrarmtyhskzjhasghdudj ; /usr/bin/python3'
Dec 01 08:46:19 np0005540741.novalocal sudo[8090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:46:20 np0005540741.novalocal python3[8092]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 01 08:46:34 np0005540741.novalocal kernel: SELinux:  Converting 385 SID table entries...
Dec 01 08:46:34 np0005540741.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 01 08:46:34 np0005540741.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 01 08:46:34 np0005540741.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 01 08:46:34 np0005540741.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 01 08:46:34 np0005540741.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 01 08:46:34 np0005540741.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 01 08:46:34 np0005540741.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 01 08:46:43 np0005540741.novalocal kernel: SELinux:  Converting 385 SID table entries...
Dec 01 08:46:43 np0005540741.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 01 08:46:43 np0005540741.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 01 08:46:43 np0005540741.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 01 08:46:43 np0005540741.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 01 08:46:43 np0005540741.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 01 08:46:43 np0005540741.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 01 08:46:43 np0005540741.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 01 08:46:51 np0005540741.novalocal kernel: SELinux:  Converting 385 SID table entries...
Dec 01 08:46:52 np0005540741.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 01 08:46:52 np0005540741.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 01 08:46:52 np0005540741.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 01 08:46:52 np0005540741.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 01 08:46:52 np0005540741.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 01 08:46:52 np0005540741.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 01 08:46:52 np0005540741.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 01 08:46:53 np0005540741.novalocal setsebool[8158]: The virt_use_nfs policy boolean was changed to 1 by root
Dec 01 08:46:53 np0005540741.novalocal setsebool[8158]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Dec 01 08:47:03 np0005540741.novalocal kernel: SELinux:  Converting 388 SID table entries...
Dec 01 08:47:03 np0005540741.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 01 08:47:03 np0005540741.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 01 08:47:03 np0005540741.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 01 08:47:03 np0005540741.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 01 08:47:03 np0005540741.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 01 08:47:03 np0005540741.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 01 08:47:03 np0005540741.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 01 08:47:21 np0005540741.novalocal dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec 01 08:47:21 np0005540741.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 01 08:47:22 np0005540741.novalocal systemd[1]: Starting man-db-cache-update.service...
Dec 01 08:47:22 np0005540741.novalocal systemd[1]: Reloading.
Dec 01 08:47:22 np0005540741.novalocal systemd-rc-local-generator[8914]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 08:47:22 np0005540741.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Dec 01 08:47:23 np0005540741.novalocal sudo[8090]: pam_unix(sudo:session): session closed for user root
Dec 01 08:47:25 np0005540741.novalocal python3[11844]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                        _uses_shell=True zuul_log_id=fa163ec2-ffbe-7826-38cf-00000000000a-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 08:47:26 np0005540741.novalocal kernel: evm: overlay not supported
Dec 01 08:47:26 np0005540741.novalocal systemd[4302]: Starting D-Bus User Message Bus...
Dec 01 08:47:26 np0005540741.novalocal dbus-broker-launch[12638]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Dec 01 08:47:26 np0005540741.novalocal dbus-broker-launch[12638]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Dec 01 08:47:26 np0005540741.novalocal systemd[4302]: Started D-Bus User Message Bus.
Dec 01 08:47:26 np0005540741.novalocal dbus-broker-lau[12638]: Ready
Dec 01 08:47:26 np0005540741.novalocal systemd[4302]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec 01 08:47:26 np0005540741.novalocal systemd[4302]: Created slice Slice /user.
Dec 01 08:47:26 np0005540741.novalocal systemd[4302]: podman-12543.scope: unit configures an IP firewall, but not running as root.
Dec 01 08:47:26 np0005540741.novalocal systemd[4302]: (This warning is only shown for the first unit using IP firewalling.)
Dec 01 08:47:26 np0005540741.novalocal systemd[4302]: Started podman-12543.scope.
Dec 01 08:47:27 np0005540741.novalocal systemd[4302]: Started podman-pause-2929e1b4.scope.
Dec 01 08:47:27 np0005540741.novalocal sudo[13166]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgdahhwftxebsfoxclozmzcxbelqshxr ; /usr/bin/python3'
Dec 01 08:47:27 np0005540741.novalocal sudo[13166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:47:27 np0005540741.novalocal python3[13188]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.103:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.103:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 08:47:27 np0005540741.novalocal python3[13188]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Dec 01 08:47:27 np0005540741.novalocal sudo[13166]: pam_unix(sudo:session): session closed for user root
Dec 01 08:47:28 np0005540741.novalocal sshd-session[8066]: Connection closed by 38.102.83.114 port 52416
Dec 01 08:47:28 np0005540741.novalocal sshd-session[8063]: pam_unix(sshd:session): session closed for user zuul
Dec 01 08:47:28 np0005540741.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Dec 01 08:47:28 np0005540741.novalocal systemd[1]: session-5.scope: Consumed 58.913s CPU time.
Dec 01 08:47:28 np0005540741.novalocal systemd-logind[788]: Session 5 logged out. Waiting for processes to exit.
Dec 01 08:47:28 np0005540741.novalocal systemd-logind[788]: Removed session 5.
Dec 01 08:47:46 np0005540741.novalocal sshd-session[20545]: Connection closed by 38.102.83.177 port 36604 [preauth]
Dec 01 08:47:46 np0005540741.novalocal sshd-session[20549]: Unable to negotiate with 38.102.83.177 port 36618: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Dec 01 08:47:46 np0005540741.novalocal sshd-session[20548]: Unable to negotiate with 38.102.83.177 port 36628: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Dec 01 08:47:46 np0005540741.novalocal sshd-session[20547]: Connection closed by 38.102.83.177 port 36602 [preauth]
Dec 01 08:47:46 np0005540741.novalocal sshd-session[20550]: Unable to negotiate with 38.102.83.177 port 36608: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Dec 01 08:47:51 np0005540741.novalocal irqbalance[783]: Cannot change IRQ 27 affinity: Operation not permitted
Dec 01 08:47:51 np0005540741.novalocal irqbalance[783]: IRQ 27 affinity is now unmanaged
Dec 01 08:47:51 np0005540741.novalocal sshd-session[22218]: Accepted publickey for zuul from 38.102.83.114 port 54970 ssh2: RSA SHA256:wvbSSRHhGqscdLbt7uF108h9jRKS/yXgXkyPU84jbuE
Dec 01 08:47:51 np0005540741.novalocal systemd-logind[788]: New session 6 of user zuul.
Dec 01 08:47:51 np0005540741.novalocal systemd[1]: Started Session 6 of User zuul.
Dec 01 08:47:51 np0005540741.novalocal sshd-session[22218]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 08:47:51 np0005540741.novalocal python3[22316]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBD1SOTRFLynXDwtgjr8Jpb8q9uXMxfBHXlGklyuTKwPjDPXnAOML+Jen7YniVKHCDh1af/hIsfBphF1Trq0+ElA= zuul@np0005540740.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 08:47:52 np0005540741.novalocal sudo[22469]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kshsgxxwjmuuzqnbfzvuugykgqnmlfuq ; /usr/bin/python3'
Dec 01 08:47:52 np0005540741.novalocal sudo[22469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:47:52 np0005540741.novalocal python3[22480]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBD1SOTRFLynXDwtgjr8Jpb8q9uXMxfBHXlGklyuTKwPjDPXnAOML+Jen7YniVKHCDh1af/hIsfBphF1Trq0+ElA= zuul@np0005540740.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 08:47:52 np0005540741.novalocal sudo[22469]: pam_unix(sudo:session): session closed for user root
Dec 01 08:47:52 np0005540741.novalocal sudo[22773]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xianpjrknzdxqxqqjeggyjmupsezhkkx ; /usr/bin/python3'
Dec 01 08:47:52 np0005540741.novalocal sudo[22773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:47:53 np0005540741.novalocal python3[22782]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005540741.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Dec 01 08:47:53 np0005540741.novalocal useradd[22855]: new group: name=cloud-admin, GID=1002
Dec 01 08:47:53 np0005540741.novalocal useradd[22855]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Dec 01 08:47:53 np0005540741.novalocal sudo[22773]: pam_unix(sudo:session): session closed for user root
Dec 01 08:47:53 np0005540741.novalocal sudo[22984]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylbxpnvmijssnqpyfkuwgtsrzodbprvj ; /usr/bin/python3'
Dec 01 08:47:53 np0005540741.novalocal sudo[22984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:47:53 np0005540741.novalocal python3[22991]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBD1SOTRFLynXDwtgjr8Jpb8q9uXMxfBHXlGklyuTKwPjDPXnAOML+Jen7YniVKHCDh1af/hIsfBphF1Trq0+ElA= zuul@np0005540740.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 08:47:53 np0005540741.novalocal sudo[22984]: pam_unix(sudo:session): session closed for user root
Dec 01 08:47:53 np0005540741.novalocal sudo[23231]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzjrgghfjvksrkdkbzyxkjlupaokofcn ; /usr/bin/python3'
Dec 01 08:47:53 np0005540741.novalocal sudo[23231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:47:54 np0005540741.novalocal python3[23244]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 08:47:54 np0005540741.novalocal sudo[23231]: pam_unix(sudo:session): session closed for user root
Dec 01 08:47:54 np0005540741.novalocal sudo[23477]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irmysmjpwuykgixdofebvonsigocrbjh ; /usr/bin/python3'
Dec 01 08:47:54 np0005540741.novalocal sudo[23477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:47:54 np0005540741.novalocal python3[23486]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764578873.7994027-135-188411347745223/source _original_basename=tmp4km6e_s3 follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 08:47:54 np0005540741.novalocal sudo[23477]: pam_unix(sudo:session): session closed for user root
Dec 01 08:47:55 np0005540741.novalocal sudo[23755]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tujsqodjzrhkqriqyxmhykoxutguytpb ; /usr/bin/python3'
Dec 01 08:47:55 np0005540741.novalocal sudo[23755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:47:55 np0005540741.novalocal python3[23766]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Dec 01 08:47:55 np0005540741.novalocal systemd[1]: Starting Hostname Service...
Dec 01 08:47:55 np0005540741.novalocal systemd[1]: Started Hostname Service.
Dec 01 08:47:55 np0005540741.novalocal systemd-hostnamed[23864]: Changed pretty hostname to 'compute-0'
Dec 01 08:47:55 compute-0 systemd-hostnamed[23864]: Hostname set to <compute-0> (static)
Dec 01 08:47:55 compute-0 NetworkManager[7186]: <info>  [1764578875.6998] hostname: static hostname changed from "np0005540741.novalocal" to "compute-0"
Dec 01 08:47:55 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 01 08:47:55 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 01 08:47:55 compute-0 sudo[23755]: pam_unix(sudo:session): session closed for user root
Dec 01 08:47:56 compute-0 sshd-session[22261]: Connection closed by 38.102.83.114 port 54970
Dec 01 08:47:56 compute-0 sshd-session[22218]: pam_unix(sshd:session): session closed for user zuul
Dec 01 08:47:56 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Dec 01 08:47:56 compute-0 systemd[1]: session-6.scope: Consumed 2.683s CPU time.
Dec 01 08:47:56 compute-0 systemd-logind[788]: Session 6 logged out. Waiting for processes to exit.
Dec 01 08:47:56 compute-0 systemd-logind[788]: Removed session 6.
Dec 01 08:48:05 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 01 08:48:13 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 01 08:48:13 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 01 08:48:13 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1min 3.072s CPU time.
Dec 01 08:48:13 compute-0 systemd[1]: run-rb558ef0e182540369dfba52fc1496cf3.service: Deactivated successfully.
Dec 01 08:48:25 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 01 08:51:06 compute-0 systemd[1]: Starting Cleanup of Temporary Directories...
Dec 01 08:51:06 compute-0 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Dec 01 08:51:07 compute-0 systemd[1]: Finished Cleanup of Temporary Directories.
Dec 01 08:51:07 compute-0 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Dec 01 08:51:27 compute-0 sshd-session[29965]: Accepted publickey for zuul from 38.102.83.177 port 60438 ssh2: RSA SHA256:wvbSSRHhGqscdLbt7uF108h9jRKS/yXgXkyPU84jbuE
Dec 01 08:51:27 compute-0 systemd-logind[788]: New session 7 of user zuul.
Dec 01 08:51:27 compute-0 systemd[1]: Started Session 7 of User zuul.
Dec 01 08:51:27 compute-0 sshd-session[29965]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 08:51:28 compute-0 python3[30041]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 08:51:29 compute-0 sudo[30155]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfbzdhgbinfghohwhfxkgpbowhqmfkoj ; /usr/bin/python3'
Dec 01 08:51:29 compute-0 sudo[30155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:51:29 compute-0 python3[30157]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 08:51:29 compute-0 sudo[30155]: pam_unix(sudo:session): session closed for user root
Dec 01 08:51:29 compute-0 sudo[30228]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbefmygyvnlahhfwzssscggmrakbxhjt ; /usr/bin/python3'
Dec 01 08:51:29 compute-0 sudo[30228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:51:30 compute-0 python3[30230]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764579089.404085-33574-56968449648190/source mode=0755 _original_basename=delorean.repo follow=False checksum=39c885eb875fd03e010d1b0454241c26b121dfb2 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 08:51:30 compute-0 sudo[30228]: pam_unix(sudo:session): session closed for user root
Dec 01 08:51:30 compute-0 sudo[30254]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kymlwqdbmcvcruezbvckdvwhmfjutfil ; /usr/bin/python3'
Dec 01 08:51:30 compute-0 sudo[30254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:51:30 compute-0 python3[30256]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 08:51:30 compute-0 sudo[30254]: pam_unix(sudo:session): session closed for user root
Dec 01 08:51:30 compute-0 sudo[30327]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrjrqzjhomxcmgohqpnxckdgaohmyifa ; /usr/bin/python3'
Dec 01 08:51:30 compute-0 sudo[30327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:51:30 compute-0 python3[30329]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764579089.404085-33574-56968449648190/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 08:51:30 compute-0 sudo[30327]: pam_unix(sudo:session): session closed for user root
Dec 01 08:51:30 compute-0 sudo[30353]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhclcgqdfrlosrxcqchnmgbqjvmcyjap ; /usr/bin/python3'
Dec 01 08:51:30 compute-0 sudo[30353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:51:31 compute-0 python3[30355]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 08:51:31 compute-0 sudo[30353]: pam_unix(sudo:session): session closed for user root
Dec 01 08:51:31 compute-0 sudo[30426]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvblnuwxmxaxxvzwhiruemrfdwmlebhv ; /usr/bin/python3'
Dec 01 08:51:31 compute-0 sudo[30426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:51:31 compute-0 python3[30428]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764579089.404085-33574-56968449648190/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 08:51:31 compute-0 sudo[30426]: pam_unix(sudo:session): session closed for user root
Dec 01 08:51:31 compute-0 sudo[30452]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlmlrdkifkxwmbhkedqqhtcwtolbxrjs ; /usr/bin/python3'
Dec 01 08:51:31 compute-0 sudo[30452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:51:31 compute-0 python3[30454]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 08:51:31 compute-0 sudo[30452]: pam_unix(sudo:session): session closed for user root
Dec 01 08:51:31 compute-0 sudo[30525]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sexionvniepucypxnrzxaewlycpakuuv ; /usr/bin/python3'
Dec 01 08:51:31 compute-0 sudo[30525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:51:31 compute-0 python3[30527]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764579089.404085-33574-56968449648190/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 08:51:31 compute-0 sudo[30525]: pam_unix(sudo:session): session closed for user root
Dec 01 08:51:32 compute-0 sudo[30551]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfqaqbcgoizekogvevfaesdhuyblzgiq ; /usr/bin/python3'
Dec 01 08:51:32 compute-0 sudo[30551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:51:32 compute-0 python3[30553]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 08:51:32 compute-0 sudo[30551]: pam_unix(sudo:session): session closed for user root
Dec 01 08:51:32 compute-0 sudo[30624]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkaktcxrtzeantqejnkomwevypgvuafz ; /usr/bin/python3'
Dec 01 08:51:32 compute-0 sudo[30624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:51:32 compute-0 python3[30626]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764579089.404085-33574-56968449648190/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 08:51:32 compute-0 sudo[30624]: pam_unix(sudo:session): session closed for user root
Dec 01 08:51:32 compute-0 sudo[30650]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzyorlnnhlhhdpykinmlkamiverxdrcq ; /usr/bin/python3'
Dec 01 08:51:32 compute-0 sudo[30650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:51:32 compute-0 python3[30652]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 08:51:32 compute-0 sudo[30650]: pam_unix(sudo:session): session closed for user root
Dec 01 08:51:32 compute-0 sudo[30723]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsogeelotldhwdlbgdyqnhiltdelhxbs ; /usr/bin/python3'
Dec 01 08:51:32 compute-0 sudo[30723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:51:33 compute-0 python3[30725]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764579089.404085-33574-56968449648190/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 08:51:33 compute-0 sudo[30723]: pam_unix(sudo:session): session closed for user root
Dec 01 08:51:33 compute-0 sudo[30749]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncxygsbhtuuewhawjsrllcnauiyvafkh ; /usr/bin/python3'
Dec 01 08:51:33 compute-0 sudo[30749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:51:33 compute-0 python3[30751]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 08:51:33 compute-0 sudo[30749]: pam_unix(sudo:session): session closed for user root
Dec 01 08:51:33 compute-0 sudo[30822]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywzmtiegtuezffhmzoxnevqhjozhxvek ; /usr/bin/python3'
Dec 01 08:51:33 compute-0 sudo[30822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 08:51:33 compute-0 python3[30824]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764579089.404085-33574-56968449648190/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=6e18e2038d54303b4926db53c0b6cced515a9151 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 08:51:33 compute-0 sudo[30822]: pam_unix(sudo:session): session closed for user root
Dec 01 08:51:35 compute-0 sshd-session[30849]: Connection closed by 192.168.122.11 port 56420 [preauth]
Dec 01 08:51:35 compute-0 sshd-session[30851]: Connection closed by 192.168.122.11 port 56404 [preauth]
Dec 01 08:51:35 compute-0 sshd-session[30850]: Unable to negotiate with 192.168.122.11 port 56436: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Dec 01 08:51:35 compute-0 sshd-session[30853]: Unable to negotiate with 192.168.122.11 port 56442: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Dec 01 08:51:35 compute-0 sshd-session[30852]: Unable to negotiate with 192.168.122.11 port 56448: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Dec 01 08:51:44 compute-0 python3[30882]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 08:56:44 compute-0 sshd-session[29968]: Received disconnect from 38.102.83.177 port 60438:11: disconnected by user
Dec 01 08:56:44 compute-0 sshd-session[29968]: Disconnected from user zuul 38.102.83.177 port 60438
Dec 01 08:56:44 compute-0 sshd-session[29965]: pam_unix(sshd:session): session closed for user zuul
Dec 01 08:56:44 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Dec 01 08:56:44 compute-0 systemd-logind[788]: Session 7 logged out. Waiting for processes to exit.
Dec 01 08:56:44 compute-0 systemd[1]: session-7.scope: Consumed 4.867s CPU time.
Dec 01 08:56:44 compute-0 systemd-logind[788]: Removed session 7.
Dec 01 09:01:01 compute-0 CROND[30890]: (root) CMD (run-parts /etc/cron.hourly)
Dec 01 09:01:01 compute-0 run-parts[30893]: (/etc/cron.hourly) starting 0anacron
Dec 01 09:01:01 compute-0 anacron[30901]: Anacron started on 2025-12-01
Dec 01 09:01:01 compute-0 anacron[30901]: Will run job `cron.daily' in 10 min.
Dec 01 09:01:01 compute-0 anacron[30901]: Will run job `cron.weekly' in 30 min.
Dec 01 09:01:01 compute-0 anacron[30901]: Will run job `cron.monthly' in 50 min.
Dec 01 09:01:01 compute-0 anacron[30901]: Jobs will be executed sequentially
Dec 01 09:01:01 compute-0 run-parts[30903]: (/etc/cron.hourly) finished 0anacron
Dec 01 09:01:01 compute-0 CROND[30889]: (root) CMDEND (run-parts /etc/cron.hourly)
Dec 01 09:04:11 compute-0 sshd-session[30905]: Accepted publickey for zuul from 192.168.122.30 port 44642 ssh2: ECDSA SHA256:qeYLzcMt7u+aIXWvxTim9ZQrhR80x0VMZgLc05vjQmw
Dec 01 09:04:11 compute-0 systemd-logind[788]: New session 8 of user zuul.
Dec 01 09:04:11 compute-0 systemd[1]: Started Session 8 of User zuul.
Dec 01 09:04:11 compute-0 sshd-session[30905]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:04:12 compute-0 python3.9[31058]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:04:13 compute-0 sudo[31237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iizlyflynojrfwqajqtnctjdefkiqnjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764579853.0061717-32-145700451257424/AnsiballZ_command.py'
Dec 01 09:04:13 compute-0 sudo[31237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:04:13 compute-0 python3.9[31239]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:04:21 compute-0 sudo[31237]: pam_unix(sudo:session): session closed for user root
Dec 01 09:04:21 compute-0 sshd-session[30908]: Connection closed by 192.168.122.30 port 44642
Dec 01 09:04:21 compute-0 sshd-session[30905]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:04:21 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Dec 01 09:04:21 compute-0 systemd[1]: session-8.scope: Consumed 7.872s CPU time.
Dec 01 09:04:21 compute-0 systemd-logind[788]: Session 8 logged out. Waiting for processes to exit.
Dec 01 09:04:21 compute-0 systemd-logind[788]: Removed session 8.
Dec 01 09:04:37 compute-0 sshd-session[31296]: Accepted publickey for zuul from 192.168.122.30 port 53838 ssh2: ECDSA SHA256:qeYLzcMt7u+aIXWvxTim9ZQrhR80x0VMZgLc05vjQmw
Dec 01 09:04:37 compute-0 systemd-logind[788]: New session 9 of user zuul.
Dec 01 09:04:37 compute-0 systemd[1]: Started Session 9 of User zuul.
Dec 01 09:04:38 compute-0 sshd-session[31296]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:04:38 compute-0 python3.9[31449]: ansible-ansible.legacy.ping Invoked with data=pong
Dec 01 09:04:39 compute-0 python3.9[31623]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:04:40 compute-0 sudo[31773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vactmjurdlhszdvjxvpyhkbmizpdqwvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764579880.260166-45-11129867041838/AnsiballZ_command.py'
Dec 01 09:04:40 compute-0 sudo[31773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:04:40 compute-0 python3.9[31775]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:04:40 compute-0 sudo[31773]: pam_unix(sudo:session): session closed for user root
Dec 01 09:04:41 compute-0 irqbalance[783]: Cannot change IRQ 26 affinity: Operation not permitted
Dec 01 09:04:41 compute-0 irqbalance[783]: IRQ 26 affinity is now unmanaged
Dec 01 09:04:41 compute-0 sudo[31926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obcnkjljrjubcvkqcnfwbtroruabvurp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764579881.140814-57-5551683337189/AnsiballZ_stat.py'
Dec 01 09:04:41 compute-0 sudo[31926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:04:41 compute-0 python3.9[31928]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:04:41 compute-0 sudo[31926]: pam_unix(sudo:session): session closed for user root
Dec 01 09:04:42 compute-0 sudo[32078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzxlfoywehzheyysegycnvyaiupqihxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764579881.9618156-65-17499747513396/AnsiballZ_file.py'
Dec 01 09:04:42 compute-0 sudo[32078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:04:42 compute-0 python3.9[32080]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:04:42 compute-0 sudo[32078]: pam_unix(sudo:session): session closed for user root
Dec 01 09:04:42 compute-0 sudo[32230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyrejbhzemxrvwuvpnmwnlhmnrzpswom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764579882.7072837-73-140268283855490/AnsiballZ_stat.py'
Dec 01 09:04:42 compute-0 sudo[32230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:04:43 compute-0 python3.9[32232]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:04:43 compute-0 sudo[32230]: pam_unix(sudo:session): session closed for user root
Dec 01 09:04:43 compute-0 sudo[32353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwsibbqjwdbrjzrmllevuybsgkplczww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764579882.7072837-73-140268283855490/AnsiballZ_copy.py'
Dec 01 09:04:43 compute-0 sudo[32353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:04:43 compute-0 python3.9[32355]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764579882.7072837-73-140268283855490/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:04:43 compute-0 sudo[32353]: pam_unix(sudo:session): session closed for user root
Dec 01 09:04:44 compute-0 sudo[32505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwuohryqiunhoxdeucwxkiupeqdfmlni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764579884.161278-88-99452596970789/AnsiballZ_setup.py'
Dec 01 09:04:44 compute-0 sudo[32505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:04:44 compute-0 python3.9[32507]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:04:44 compute-0 sudo[32505]: pam_unix(sudo:session): session closed for user root
Dec 01 09:04:45 compute-0 sudo[32661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zubjklfsxxfyyxvyueoxxbuudkdpvvro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764579885.0622318-96-25085966043855/AnsiballZ_file.py'
Dec 01 09:04:45 compute-0 sudo[32661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:04:45 compute-0 python3.9[32663]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:04:45 compute-0 sudo[32661]: pam_unix(sudo:session): session closed for user root
Dec 01 09:04:46 compute-0 sudo[32813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqjkbqwcgkpcsihybyjeeeiinkepgorg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764579885.7341511-105-87596232117236/AnsiballZ_file.py'
Dec 01 09:04:46 compute-0 sudo[32813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:04:46 compute-0 python3.9[32815]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:04:46 compute-0 sudo[32813]: pam_unix(sudo:session): session closed for user root
Dec 01 09:04:47 compute-0 python3.9[32965]: ansible-ansible.builtin.service_facts Invoked
Dec 01 09:04:50 compute-0 python3.9[33218]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:04:51 compute-0 python3.9[33368]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:04:52 compute-0 python3.9[33522]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:04:52 compute-0 sudo[33678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebemzfwbhvoelsdsfpthvbplpjhzddez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764579892.6834724-153-95153483061242/AnsiballZ_setup.py'
Dec 01 09:04:52 compute-0 sudo[33678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:04:53 compute-0 python3.9[33680]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 09:04:53 compute-0 sudo[33678]: pam_unix(sudo:session): session closed for user root
Dec 01 09:04:53 compute-0 sudo[33762]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhwpnzjkmkqldhmglunoepkbnyozktva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764579892.6834724-153-95153483061242/AnsiballZ_dnf.py'
Dec 01 09:04:53 compute-0 sudo[33762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:04:54 compute-0 python3.9[33764]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:05:40 compute-0 systemd[1]: Reloading.
Dec 01 09:05:40 compute-0 systemd-rc-local-generator[33961]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:05:40 compute-0 systemd[1]: Starting dnf makecache...
Dec 01 09:05:40 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Dec 01 09:05:40 compute-0 dnf[33971]: Failed determining last makecache time.
Dec 01 09:05:40 compute-0 dnf[33971]: delorean-openstack-barbican-42b4c41831408a8e323 149 kB/s | 3.0 kB     00:00
Dec 01 09:05:40 compute-0 dnf[33971]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 167 kB/s | 3.0 kB     00:00
Dec 01 09:05:40 compute-0 dnf[33971]: delorean-openstack-cinder-1c00d6490d88e436f26ef 186 kB/s | 3.0 kB     00:00
Dec 01 09:05:40 compute-0 dnf[33971]: delorean-python-stevedore-c4acc5639fd2329372142 184 kB/s | 3.0 kB     00:00
Dec 01 09:05:40 compute-0 dnf[33971]: delorean-python-cloudkitty-tests-tempest-2c80f8 191 kB/s | 3.0 kB     00:00
Dec 01 09:05:40 compute-0 dnf[33971]: delorean-os-net-config-d0cedbdb788d43e5c7551df5 169 kB/s | 3.0 kB     00:00
Dec 01 09:05:40 compute-0 systemd[1]: Reloading.
Dec 01 09:05:40 compute-0 dnf[33971]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 167 kB/s | 3.0 kB     00:00
Dec 01 09:05:40 compute-0 dnf[33971]: delorean-python-designate-tests-tempest-347fdbc 196 kB/s | 3.0 kB     00:00
Dec 01 09:05:40 compute-0 dnf[33971]: delorean-openstack-glance-1fd12c29b339f30fe823e 191 kB/s | 3.0 kB     00:00
Dec 01 09:05:40 compute-0 systemd-rc-local-generator[34010]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:05:40 compute-0 dnf[33971]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 126 kB/s | 3.0 kB     00:00
Dec 01 09:05:40 compute-0 dnf[33971]: delorean-openstack-manila-3c01b7181572c95dac462 173 kB/s | 3.0 kB     00:00
Dec 01 09:05:40 compute-0 dnf[33971]: delorean-python-whitebox-neutron-tests-tempest- 187 kB/s | 3.0 kB     00:00
Dec 01 09:05:40 compute-0 dnf[33971]: delorean-openstack-octavia-ba397f07a7331190208c 187 kB/s | 3.0 kB     00:00
Dec 01 09:05:40 compute-0 dnf[33971]: delorean-openstack-watcher-c014f81a8647287f6dcc 181 kB/s | 3.0 kB     00:00
Dec 01 09:05:40 compute-0 dnf[33971]: delorean-ansible-config_template-5ccaa22121a7ff 187 kB/s | 3.0 kB     00:00
Dec 01 09:05:40 compute-0 dnf[33971]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 194 kB/s | 3.0 kB     00:00
Dec 01 09:05:40 compute-0 dnf[33971]: delorean-openstack-swift-dc98a8463506ac520c469a 195 kB/s | 3.0 kB     00:00
Dec 01 09:05:40 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Dec 01 09:05:40 compute-0 dnf[33971]: delorean-python-tempestconf-8515371b7cceebd4282 177 kB/s | 3.0 kB     00:00
Dec 01 09:05:40 compute-0 dnf[33971]: delorean-openstack-heat-ui-013accbfd179753bc3f0 196 kB/s | 3.0 kB     00:00
Dec 01 09:05:40 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Dec 01 09:05:40 compute-0 systemd[1]: Reloading.
Dec 01 09:05:41 compute-0 dnf[33971]: CentOS Stream 9 - BaseOS                         79 kB/s | 7.3 kB     00:00
Dec 01 09:05:41 compute-0 systemd-rc-local-generator[34064]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:05:41 compute-0 dnf[33971]: CentOS Stream 9 - AppStream                      78 kB/s | 7.4 kB     00:00
Dec 01 09:05:41 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Dec 01 09:05:41 compute-0 dbus-broker-launch[757]: Noticed file-system modification, trigger reload.
Dec 01 09:05:41 compute-0 dbus-broker-launch[757]: Noticed file-system modification, trigger reload.
Dec 01 09:05:41 compute-0 dbus-broker-launch[757]: Noticed file-system modification, trigger reload.
Dec 01 09:05:41 compute-0 dnf[33971]: CentOS Stream 9 - CRB                            48 kB/s | 7.2 kB     00:00
Dec 01 09:05:41 compute-0 dnf[33971]: CentOS Stream 9 - Extras packages                72 kB/s | 8.3 kB     00:00
Dec 01 09:05:41 compute-0 dnf[33971]: dlrn-antelope-testing                           168 kB/s | 3.0 kB     00:00
Dec 01 09:05:41 compute-0 dnf[33971]: dlrn-antelope-build-deps                        172 kB/s | 3.0 kB     00:00
Dec 01 09:05:41 compute-0 dnf[33971]: centos9-rabbitmq                                113 kB/s | 3.0 kB     00:00
Dec 01 09:05:41 compute-0 dnf[33971]: centos9-storage                                 127 kB/s | 3.0 kB     00:00
Dec 01 09:05:41 compute-0 dnf[33971]: centos9-opstools                                132 kB/s | 3.0 kB     00:00
Dec 01 09:05:41 compute-0 dnf[33971]: NFV SIG OpenvSwitch                             142 kB/s | 3.0 kB     00:00
Dec 01 09:05:41 compute-0 dnf[33971]: repo-setup-centos-appstream                     210 kB/s | 4.4 kB     00:00
Dec 01 09:05:41 compute-0 dnf[33971]: repo-setup-centos-baseos                        173 kB/s | 3.9 kB     00:00
Dec 01 09:05:41 compute-0 dnf[33971]: repo-setup-centos-highavailability              165 kB/s | 3.9 kB     00:00
Dec 01 09:05:42 compute-0 dnf[33971]: repo-setup-centos-powertools                    183 kB/s | 4.3 kB     00:00
Dec 01 09:05:42 compute-0 dnf[33971]: Extra Packages for Enterprise Linux 9 - x86_64  238 kB/s |  30 kB     00:00
Dec 01 09:05:42 compute-0 dnf[33971]: Metadata cache created.
Dec 01 09:05:42 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Dec 01 09:05:42 compute-0 systemd[1]: Finished dnf makecache.
Dec 01 09:05:42 compute-0 systemd[1]: dnf-makecache.service: Consumed 1.762s CPU time.
Dec 01 09:06:45 compute-0 kernel: SELinux:  Converting 2718 SID table entries...
Dec 01 09:06:45 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 01 09:06:45 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 01 09:06:45 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 01 09:06:45 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 01 09:06:45 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 01 09:06:45 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 01 09:06:45 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 01 09:06:46 compute-0 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Dec 01 09:06:46 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 01 09:06:46 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 01 09:06:46 compute-0 systemd[1]: Reloading.
Dec 01 09:06:46 compute-0 systemd-rc-local-generator[34428]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:06:46 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 01 09:06:47 compute-0 sudo[33762]: pam_unix(sudo:session): session closed for user root
Dec 01 09:06:47 compute-0 sudo[35337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aakmhpqiflpmnqcijondlzmzrzkwimtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580007.3151798-165-183380349944333/AnsiballZ_command.py'
Dec 01 09:06:47 compute-0 sudo[35337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:06:47 compute-0 python3.9[35339]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:06:47 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 01 09:06:47 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 01 09:06:47 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.155s CPU time.
Dec 01 09:06:47 compute-0 systemd[1]: run-r5f1981fab399474899468b8fa3ca5782.service: Deactivated successfully.
Dec 01 09:06:48 compute-0 sudo[35337]: pam_unix(sudo:session): session closed for user root
Dec 01 09:06:49 compute-0 sudo[35619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mawaqfpvwacvfhszecothbzfbczqlpim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580008.8828132-173-280374230800346/AnsiballZ_selinux.py'
Dec 01 09:06:49 compute-0 sudo[35619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:06:49 compute-0 python3.9[35621]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Dec 01 09:06:49 compute-0 sudo[35619]: pam_unix(sudo:session): session closed for user root
Dec 01 09:06:50 compute-0 sudo[35771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlhtbqieejelvxhnoeyyxrjgbqzizqvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580010.1850998-184-198925151516876/AnsiballZ_command.py'
Dec 01 09:06:50 compute-0 sudo[35771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:06:50 compute-0 python3.9[35773]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Dec 01 09:06:51 compute-0 sudo[35771]: pam_unix(sudo:session): session closed for user root
Dec 01 09:06:51 compute-0 sudo[35924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pngmoopscdxrgdcteoqabvdlyrfvjahe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580011.7289023-192-49874263921887/AnsiballZ_file.py'
Dec 01 09:06:51 compute-0 sudo[35924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:06:55 compute-0 python3.9[35926]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:06:55 compute-0 sudo[35924]: pam_unix(sudo:session): session closed for user root
Dec 01 09:06:56 compute-0 sudo[36076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztberzkwjaksavsrekgbquahsdteyndm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580015.609572-200-191736012090800/AnsiballZ_mount.py'
Dec 01 09:06:56 compute-0 sudo[36076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:06:56 compute-0 python3.9[36078]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Dec 01 09:06:56 compute-0 sudo[36076]: pam_unix(sudo:session): session closed for user root
Dec 01 09:06:57 compute-0 sudo[36229]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unmqpytpjjzxoufpbxletzndjilwdyih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580017.1641555-228-234989626245106/AnsiballZ_file.py'
Dec 01 09:06:57 compute-0 sudo[36229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:06:59 compute-0 python3.9[36231]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:06:59 compute-0 sudo[36229]: pam_unix(sudo:session): session closed for user root
Dec 01 09:07:00 compute-0 sudo[36381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imdopfkpiahvhotpeajyfklykkgfgtwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580019.8627565-236-260708985041698/AnsiballZ_stat.py'
Dec 01 09:07:00 compute-0 sudo[36381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:07:02 compute-0 python3.9[36383]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:07:02 compute-0 sudo[36381]: pam_unix(sudo:session): session closed for user root
Dec 01 09:07:02 compute-0 sudo[36504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpxqaojzyhfeozcdbwsmtuhysvxoxhjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580019.8627565-236-260708985041698/AnsiballZ_copy.py'
Dec 01 09:07:02 compute-0 sudo[36504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:07:03 compute-0 python3.9[36506]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580019.8627565-236-260708985041698/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=30f43b3b193e6fb640de0bf588bd9062982dce0b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:07:03 compute-0 sudo[36504]: pam_unix(sudo:session): session closed for user root
Dec 01 09:07:04 compute-0 sudo[36656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsqzxdnenevcgzfwcuczeqfcwkoxlprr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580023.6210258-260-78454880436405/AnsiballZ_stat.py'
Dec 01 09:07:04 compute-0 sudo[36656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:07:04 compute-0 python3.9[36658]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:07:04 compute-0 sudo[36656]: pam_unix(sudo:session): session closed for user root
Dec 01 09:07:04 compute-0 sudo[36808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jozgyuytwkznotvvkyjzeleagwducdee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580024.3886695-268-229068653187922/AnsiballZ_command.py'
Dec 01 09:07:04 compute-0 sudo[36808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:07:04 compute-0 python3.9[36810]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:07:04 compute-0 sudo[36808]: pam_unix(sudo:session): session closed for user root
Dec 01 09:07:05 compute-0 sudo[36961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lokbrhkfuwazgrlcxoqmuotxhdaeohpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580025.3480809-276-133147602027381/AnsiballZ_file.py'
Dec 01 09:07:05 compute-0 sudo[36961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:07:05 compute-0 python3.9[36963]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:07:05 compute-0 sudo[36961]: pam_unix(sudo:session): session closed for user root
Dec 01 09:07:06 compute-0 sudo[37113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-woapkoyruoblwvrcadsebrfbgkydevnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580026.2370815-287-176304497917325/AnsiballZ_getent.py'
Dec 01 09:07:06 compute-0 sudo[37113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:07:06 compute-0 python3.9[37115]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Dec 01 09:07:06 compute-0 sudo[37113]: pam_unix(sudo:session): session closed for user root
Dec 01 09:07:06 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 01 09:07:07 compute-0 sudo[37267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mofofeztyekdcpbvcuhdbrmrraaokada ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580027.082136-295-262946025021213/AnsiballZ_group.py'
Dec 01 09:07:07 compute-0 sudo[37267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:07:07 compute-0 python3.9[37269]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 01 09:07:07 compute-0 groupadd[37270]: group added to /etc/group: name=qemu, GID=107
Dec 01 09:07:07 compute-0 groupadd[37270]: group added to /etc/gshadow: name=qemu
Dec 01 09:07:07 compute-0 groupadd[37270]: new group: name=qemu, GID=107
Dec 01 09:07:07 compute-0 sudo[37267]: pam_unix(sudo:session): session closed for user root
Dec 01 09:07:08 compute-0 sudo[37425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvjdryoyylfoatxikvawnxuxsfghkaov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580027.944746-303-66825337188734/AnsiballZ_user.py'
Dec 01 09:07:08 compute-0 sudo[37425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:07:08 compute-0 python3.9[37427]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 01 09:07:08 compute-0 useradd[37429]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Dec 01 09:07:08 compute-0 sudo[37425]: pam_unix(sudo:session): session closed for user root
Dec 01 09:07:09 compute-0 sudo[37585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tiogkwdynpmllvjwhintwrofvcprbkkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580028.9801853-311-188175524879054/AnsiballZ_getent.py'
Dec 01 09:07:09 compute-0 sudo[37585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:07:09 compute-0 python3.9[37587]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Dec 01 09:07:09 compute-0 sudo[37585]: pam_unix(sudo:session): session closed for user root
Dec 01 09:07:09 compute-0 sudo[37738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvgrktmnrfrmshhiqlhbawadxsoxzhet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580029.6058815-319-105447553652226/AnsiballZ_group.py'
Dec 01 09:07:09 compute-0 sudo[37738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:07:10 compute-0 python3.9[37740]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 01 09:07:10 compute-0 groupadd[37741]: group added to /etc/group: name=hugetlbfs, GID=42477
Dec 01 09:07:10 compute-0 groupadd[37741]: group added to /etc/gshadow: name=hugetlbfs
Dec 01 09:07:10 compute-0 groupadd[37741]: new group: name=hugetlbfs, GID=42477
Dec 01 09:07:10 compute-0 sudo[37738]: pam_unix(sudo:session): session closed for user root
Dec 01 09:07:10 compute-0 sudo[37896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnkxsvmmatjyntfqbfjbdyegtiwmpiuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580030.2708063-328-198388527255129/AnsiballZ_file.py'
Dec 01 09:07:10 compute-0 sudo[37896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:07:10 compute-0 python3.9[37898]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Dec 01 09:07:10 compute-0 sudo[37896]: pam_unix(sudo:session): session closed for user root
Dec 01 09:07:11 compute-0 sudo[38048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pierczgfsbupdulrdtkicssnfwggilnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580031.0689487-339-185940121825551/AnsiballZ_dnf.py'
Dec 01 09:07:11 compute-0 sudo[38048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:07:11 compute-0 python3.9[38050]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:07:13 compute-0 sudo[38048]: pam_unix(sudo:session): session closed for user root
Dec 01 09:07:13 compute-0 sudo[38201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohvfipeujkunzafqftfsliobaifuhblw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580033.5374057-347-133930974351083/AnsiballZ_file.py'
Dec 01 09:07:13 compute-0 sudo[38201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:07:14 compute-0 python3.9[38203]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:07:14 compute-0 sudo[38201]: pam_unix(sudo:session): session closed for user root
Dec 01 09:07:14 compute-0 sudo[38353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjsyimmsgopfxfuqpdtlcxlxvlvevetc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580034.1952848-355-214382523928274/AnsiballZ_stat.py'
Dec 01 09:07:14 compute-0 sudo[38353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:07:14 compute-0 python3.9[38355]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:07:14 compute-0 sudo[38353]: pam_unix(sudo:session): session closed for user root
Dec 01 09:07:14 compute-0 sudo[38476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvgjelrxzbymnryycgwpwenpuvdgrxen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580034.1952848-355-214382523928274/AnsiballZ_copy.py'
Dec 01 09:07:14 compute-0 sudo[38476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:07:15 compute-0 python3.9[38478]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764580034.1952848-355-214382523928274/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:07:15 compute-0 sudo[38476]: pam_unix(sudo:session): session closed for user root
Dec 01 09:07:15 compute-0 sudo[38628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvijlxzkberavhuhgbjdibztgyajmhoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580035.3156111-370-151734810915839/AnsiballZ_systemd.py'
Dec 01 09:07:15 compute-0 sudo[38628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:07:16 compute-0 python3.9[38630]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 09:07:16 compute-0 systemd[1]: Starting Load Kernel Modules...
Dec 01 09:07:16 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec 01 09:07:16 compute-0 kernel: Bridge firewalling registered
Dec 01 09:07:16 compute-0 systemd-modules-load[38634]: Inserted module 'br_netfilter'
Dec 01 09:07:16 compute-0 systemd[1]: Finished Load Kernel Modules.
Dec 01 09:07:16 compute-0 sudo[38628]: pam_unix(sudo:session): session closed for user root
Dec 01 09:07:16 compute-0 sudo[38787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhrudguyhghrcbjcmujowrswrekkirsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580036.5182004-378-59291590312498/AnsiballZ_stat.py'
Dec 01 09:07:16 compute-0 sudo[38787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:07:16 compute-0 python3.9[38789]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:07:16 compute-0 sudo[38787]: pam_unix(sudo:session): session closed for user root
Dec 01 09:07:17 compute-0 sudo[38910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sirxlrqhhgduhlptyyjumrzbotaarvnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580036.5182004-378-59291590312498/AnsiballZ_copy.py'
Dec 01 09:07:17 compute-0 sudo[38910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:07:17 compute-0 python3.9[38912]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764580036.5182004-378-59291590312498/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:07:17 compute-0 sudo[38910]: pam_unix(sudo:session): session closed for user root
Dec 01 09:07:18 compute-0 sudo[39062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdxqotawchkjgoyngsiubuzuzipibnwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580037.8869584-396-76451992900468/AnsiballZ_dnf.py'
Dec 01 09:07:18 compute-0 sudo[39062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:07:18 compute-0 python3.9[39064]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:07:21 compute-0 dbus-broker-launch[757]: Noticed file-system modification, trigger reload.
Dec 01 09:07:21 compute-0 dbus-broker-launch[757]: Noticed file-system modification, trigger reload.
Dec 01 09:07:22 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 01 09:07:22 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 01 09:07:22 compute-0 systemd[1]: Reloading.
Dec 01 09:07:22 compute-0 systemd-rc-local-generator[39126]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:07:22 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 01 09:07:23 compute-0 sudo[39062]: pam_unix(sudo:session): session closed for user root
Dec 01 09:07:24 compute-0 python3.9[41104]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:07:24 compute-0 python3.9[42101]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec 01 09:07:25 compute-0 python3.9[42771]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:07:26 compute-0 sudo[43221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iidgssufxqpqkwktuffztlognlchghdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580045.7809196-435-2540676620381/AnsiballZ_command.py'
Dec 01 09:07:26 compute-0 sudo[43221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:07:26 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 01 09:07:26 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 01 09:07:26 compute-0 systemd[1]: man-db-cache-update.service: Consumed 4.733s CPU time.
Dec 01 09:07:26 compute-0 systemd[1]: run-r9f23171367874ed690c57a11302e9c11.service: Deactivated successfully.
Dec 01 09:07:26 compute-0 python3.9[43223]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:07:26 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 01 09:07:26 compute-0 systemd[1]: Starting Authorization Manager...
Dec 01 09:07:26 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Dec 01 09:07:26 compute-0 polkitd[43441]: Started polkitd version 0.117
Dec 01 09:07:26 compute-0 polkitd[43441]: Loading rules from directory /etc/polkit-1/rules.d
Dec 01 09:07:26 compute-0 polkitd[43441]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 01 09:07:26 compute-0 polkitd[43441]: Finished loading, compiling and executing 2 rules
Dec 01 09:07:26 compute-0 systemd[1]: Started Authorization Manager.
Dec 01 09:07:26 compute-0 polkitd[43441]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Dec 01 09:07:27 compute-0 sudo[43221]: pam_unix(sudo:session): session closed for user root
Dec 01 09:07:27 compute-0 sudo[43609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpnsipwamztxzrbkarxqsgyygqzduvle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580047.270429-444-39476346435101/AnsiballZ_systemd.py'
Dec 01 09:07:27 compute-0 sudo[43609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:07:27 compute-0 python3.9[43611]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:07:27 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec 01 09:07:27 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Dec 01 09:07:27 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec 01 09:07:27 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 01 09:07:28 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Dec 01 09:07:28 compute-0 sudo[43609]: pam_unix(sudo:session): session closed for user root
Dec 01 09:07:28 compute-0 python3.9[43773]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec 01 09:07:30 compute-0 sudo[43923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcdlqwoytugfwgiundbjbkswjpdiiaub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580050.3850818-501-113440284963812/AnsiballZ_systemd.py'
Dec 01 09:07:30 compute-0 sudo[43923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:07:30 compute-0 python3.9[43925]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:07:31 compute-0 systemd[1]: Reloading.
Dec 01 09:07:31 compute-0 systemd-rc-local-generator[43955]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:07:31 compute-0 sudo[43923]: pam_unix(sudo:session): session closed for user root
Dec 01 09:07:31 compute-0 sudo[44112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrolbdjtgnglboksszdyetqqfgmmhorw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580051.3705351-501-25072450703878/AnsiballZ_systemd.py'
Dec 01 09:07:31 compute-0 sudo[44112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:07:32 compute-0 python3.9[44114]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:07:32 compute-0 systemd[1]: Reloading.
Dec 01 09:07:32 compute-0 systemd-rc-local-generator[44142]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:07:33 compute-0 sudo[44112]: pam_unix(sudo:session): session closed for user root
Dec 01 09:07:33 compute-0 sudo[44300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pofpijtdqagqdqvnhnhafcmdpcofrxaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580053.3719814-517-112704118046101/AnsiballZ_command.py'
Dec 01 09:07:33 compute-0 sudo[44300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:07:33 compute-0 python3.9[44302]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:07:33 compute-0 sudo[44300]: pam_unix(sudo:session): session closed for user root
Dec 01 09:07:34 compute-0 sudo[44453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxunowuhoejeszcpvjnonlphkysdjgds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580054.084185-525-226479889029294/AnsiballZ_command.py'
Dec 01 09:07:34 compute-0 sudo[44453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:07:34 compute-0 python3.9[44455]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:07:34 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Dec 01 09:07:34 compute-0 sudo[44453]: pam_unix(sudo:session): session closed for user root
Dec 01 09:07:34 compute-0 sudo[44606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnwivucxgtjnxurwazstrrthlaccomqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580054.657981-533-107961841947596/AnsiballZ_command.py'
Dec 01 09:07:34 compute-0 sudo[44606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:07:35 compute-0 python3.9[44608]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:07:36 compute-0 sudo[44606]: pam_unix(sudo:session): session closed for user root
Dec 01 09:07:37 compute-0 sudo[44768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dobvizlegzftegkqkhyyofsrnacscuye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580056.8581598-541-229243080882126/AnsiballZ_command.py'
Dec 01 09:07:37 compute-0 sudo[44768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:07:37 compute-0 python3.9[44770]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:07:37 compute-0 sudo[44768]: pam_unix(sudo:session): session closed for user root
Dec 01 09:07:37 compute-0 sudo[44921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iulflcfbbyuexbticqbwxbetbcmxecuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580057.5112915-549-16029412839386/AnsiballZ_systemd.py'
Dec 01 09:07:37 compute-0 sudo[44921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:07:38 compute-0 python3.9[44923]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 09:07:38 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 01 09:07:38 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Dec 01 09:07:38 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Dec 01 09:07:38 compute-0 systemd[1]: Starting Apply Kernel Variables...
Dec 01 09:07:38 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 01 09:07:38 compute-0 systemd[1]: Finished Apply Kernel Variables.
Dec 01 09:07:38 compute-0 sudo[44921]: pam_unix(sudo:session): session closed for user root
Dec 01 09:07:38 compute-0 sshd-session[31299]: Connection closed by 192.168.122.30 port 53838
Dec 01 09:07:38 compute-0 sshd-session[31296]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:07:38 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Dec 01 09:07:38 compute-0 systemd[1]: session-9.scope: Consumed 2min 11.559s CPU time.
Dec 01 09:07:38 compute-0 systemd-logind[788]: Session 9 logged out. Waiting for processes to exit.
Dec 01 09:07:38 compute-0 systemd-logind[788]: Removed session 9.
Dec 01 09:07:45 compute-0 sshd-session[44953]: Accepted publickey for zuul from 192.168.122.30 port 57622 ssh2: ECDSA SHA256:qeYLzcMt7u+aIXWvxTim9ZQrhR80x0VMZgLc05vjQmw
Dec 01 09:07:45 compute-0 systemd-logind[788]: New session 10 of user zuul.
Dec 01 09:07:45 compute-0 systemd[1]: Started Session 10 of User zuul.
Dec 01 09:07:45 compute-0 sshd-session[44953]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:07:46 compute-0 python3.9[45106]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:07:47 compute-0 sudo[45260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdtscpfnxauxejduxyfxtcpomxabvwet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580066.6176674-36-146086910104353/AnsiballZ_getent.py'
Dec 01 09:07:47 compute-0 sudo[45260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:07:47 compute-0 python3.9[45262]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Dec 01 09:07:47 compute-0 sudo[45260]: pam_unix(sudo:session): session closed for user root
Dec 01 09:07:47 compute-0 sudo[45413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyzipwmupdtwkcanntaechmkzaowyvpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580067.4895966-44-272515854219490/AnsiballZ_group.py'
Dec 01 09:07:47 compute-0 sudo[45413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:07:48 compute-0 python3.9[45415]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 01 09:07:48 compute-0 groupadd[45416]: group added to /etc/group: name=openvswitch, GID=42476
Dec 01 09:07:48 compute-0 groupadd[45416]: group added to /etc/gshadow: name=openvswitch
Dec 01 09:07:48 compute-0 groupadd[45416]: new group: name=openvswitch, GID=42476
Dec 01 09:07:48 compute-0 sudo[45413]: pam_unix(sudo:session): session closed for user root
Dec 01 09:07:48 compute-0 sudo[45571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmzeuchzkffjmtdxaylxehhsiamgtasx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580068.3803637-52-195656632897904/AnsiballZ_user.py'
Dec 01 09:07:48 compute-0 sudo[45571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:07:49 compute-0 python3.9[45573]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 01 09:07:49 compute-0 useradd[45575]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Dec 01 09:07:49 compute-0 useradd[45575]: add 'openvswitch' to group 'hugetlbfs'
Dec 01 09:07:49 compute-0 useradd[45575]: add 'openvswitch' to shadow group 'hugetlbfs'
Dec 01 09:07:49 compute-0 sudo[45571]: pam_unix(sudo:session): session closed for user root
Dec 01 09:07:49 compute-0 sudo[45731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqdjtcqsgpgbhmfpubendfqghrblrkpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580069.4782865-62-134927168813773/AnsiballZ_setup.py'
Dec 01 09:07:49 compute-0 sudo[45731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:07:50 compute-0 python3.9[45733]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 09:07:50 compute-0 sudo[45731]: pam_unix(sudo:session): session closed for user root
Dec 01 09:07:50 compute-0 sudo[45815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wggidfxjaqftnqnzshalesmeuizcbwtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580069.4782865-62-134927168813773/AnsiballZ_dnf.py'
Dec 01 09:07:50 compute-0 sudo[45815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:07:50 compute-0 python3.9[45817]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 01 09:07:52 compute-0 sudo[45815]: pam_unix(sudo:session): session closed for user root
Dec 01 09:07:53 compute-0 sudo[45981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhftgtxmexkvwvmwnmappwgbkwkunnpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580073.1625178-76-266826133350965/AnsiballZ_dnf.py'
Dec 01 09:07:53 compute-0 sudo[45981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:07:53 compute-0 python3.9[45983]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:08:04 compute-0 kernel: SELinux:  Converting 2730 SID table entries...
Dec 01 09:08:04 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 01 09:08:04 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 01 09:08:04 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 01 09:08:04 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 01 09:08:04 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 01 09:08:04 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 01 09:08:04 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 01 09:08:04 compute-0 groupadd[46006]: group added to /etc/group: name=unbound, GID=993
Dec 01 09:08:04 compute-0 groupadd[46006]: group added to /etc/gshadow: name=unbound
Dec 01 09:08:04 compute-0 groupadd[46006]: new group: name=unbound, GID=993
Dec 01 09:08:04 compute-0 useradd[46013]: new user: name=unbound, UID=993, GID=993, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Dec 01 09:08:05 compute-0 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Dec 01 09:08:05 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Dec 01 09:08:06 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 01 09:08:06 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 01 09:08:06 compute-0 systemd[1]: Reloading.
Dec 01 09:08:06 compute-0 systemd-rc-local-generator[46512]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:08:06 compute-0 systemd-sysv-generator[46516]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:08:06 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 01 09:08:06 compute-0 sudo[45981]: pam_unix(sudo:session): session closed for user root
Dec 01 09:08:06 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 01 09:08:06 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 01 09:08:06 compute-0 systemd[1]: run-rbfc7a711b29743839570d8236289c43f.service: Deactivated successfully.
Dec 01 09:08:07 compute-0 sudo[47078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwnyuvrumhvuytaqbtxjqkygcnvnrbnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580087.0601833-84-134888923311105/AnsiballZ_systemd.py'
Dec 01 09:08:07 compute-0 sudo[47078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:08:07 compute-0 python3.9[47080]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 01 09:08:08 compute-0 systemd[1]: Reloading.
Dec 01 09:08:08 compute-0 systemd-rc-local-generator[47110]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:08:08 compute-0 systemd-sysv-generator[47115]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:08:08 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Dec 01 09:08:08 compute-0 chown[47122]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Dec 01 09:08:08 compute-0 ovs-ctl[47127]: /etc/openvswitch/conf.db does not exist ... (warning).
Dec 01 09:08:08 compute-0 ovs-ctl[47127]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Dec 01 09:08:08 compute-0 ovs-ctl[47127]: Starting ovsdb-server [  OK  ]
Dec 01 09:08:08 compute-0 ovs-vsctl[47176]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Dec 01 09:08:08 compute-0 ovs-vsctl[47196]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"a8013a17-6378-4c2f-a5de-9d3b29c7a42e\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Dec 01 09:08:08 compute-0 ovs-ctl[47127]: Configuring Open vSwitch system IDs [  OK  ]
Dec 01 09:08:08 compute-0 ovs-ctl[47127]: Enabling remote OVSDB managers [  OK  ]
Dec 01 09:08:08 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Dec 01 09:08:08 compute-0 ovs-vsctl[47202]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Dec 01 09:08:08 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Dec 01 09:08:08 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Dec 01 09:08:08 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Dec 01 09:08:08 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Dec 01 09:08:08 compute-0 ovs-ctl[47246]: Inserting openvswitch module [  OK  ]
Dec 01 09:08:08 compute-0 ovs-ctl[47215]: Starting ovs-vswitchd [  OK  ]
Dec 01 09:08:08 compute-0 ovs-vsctl[47263]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Dec 01 09:08:08 compute-0 ovs-ctl[47215]: Enabling remote OVSDB managers [  OK  ]
Dec 01 09:08:08 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Dec 01 09:08:08 compute-0 systemd[1]: Starting Open vSwitch...
Dec 01 09:08:08 compute-0 systemd[1]: Finished Open vSwitch.
Dec 01 09:08:08 compute-0 sudo[47078]: pam_unix(sudo:session): session closed for user root
Dec 01 09:08:09 compute-0 python3.9[47415]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:08:10 compute-0 sudo[47565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ansxiqbsssvkklcpbgulkmatnrsoludz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580089.7862496-102-8016308065921/AnsiballZ_sefcontext.py'
Dec 01 09:08:10 compute-0 sudo[47565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:08:10 compute-0 python3.9[47567]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Dec 01 09:08:11 compute-0 kernel: SELinux:  Converting 2744 SID table entries...
Dec 01 09:08:11 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 01 09:08:11 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 01 09:08:11 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 01 09:08:11 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 01 09:08:11 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 01 09:08:11 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 01 09:08:11 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 01 09:08:11 compute-0 sudo[47565]: pam_unix(sudo:session): session closed for user root
Dec 01 09:08:12 compute-0 python3.9[47722]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:08:13 compute-0 sudo[47878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oykacfoduwqtzigdizcdcrvypnredkds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580092.984358-120-273758228149009/AnsiballZ_dnf.py'
Dec 01 09:08:13 compute-0 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Dec 01 09:08:13 compute-0 sudo[47878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:08:13 compute-0 python3.9[47880]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:08:14 compute-0 sudo[47878]: pam_unix(sudo:session): session closed for user root
Dec 01 09:08:15 compute-0 sudo[48031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcuclfwvakbwlkxctkdoiomihjuhoeio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580094.8554425-128-167200405063832/AnsiballZ_command.py'
Dec 01 09:08:15 compute-0 sudo[48031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:08:15 compute-0 python3.9[48033]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:08:16 compute-0 sudo[48031]: pam_unix(sudo:session): session closed for user root
Dec 01 09:08:16 compute-0 sudo[48318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajkrgdddsjxslpjvuftvbqtvyiieblno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580096.3639662-136-112662412719801/AnsiballZ_file.py'
Dec 01 09:08:16 compute-0 sudo[48318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:08:17 compute-0 python3.9[48320]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 01 09:08:17 compute-0 sudo[48318]: pam_unix(sudo:session): session closed for user root
Dec 01 09:08:17 compute-0 python3.9[48470]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:08:18 compute-0 sudo[48622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnxtjmazrzetmpoxikscsgjilzxkigni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580098.1433241-152-212369621848287/AnsiballZ_dnf.py'
Dec 01 09:08:18 compute-0 sudo[48622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:08:18 compute-0 python3.9[48624]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:08:20 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 01 09:08:20 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 01 09:08:20 compute-0 systemd[1]: Reloading.
Dec 01 09:08:20 compute-0 systemd-sysv-generator[48667]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:08:20 compute-0 systemd-rc-local-generator[48663]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:08:20 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 01 09:08:21 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 01 09:08:21 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 01 09:08:21 compute-0 systemd[1]: run-r5f25536f56ca45d8a59d64e2c8369290.service: Deactivated successfully.
Dec 01 09:08:21 compute-0 sudo[48622]: pam_unix(sudo:session): session closed for user root
Dec 01 09:08:21 compute-0 sudo[48939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-meszxatbqktzcqkhgxrikkvercasmwkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580101.395779-160-15176499961974/AnsiballZ_systemd.py'
Dec 01 09:08:21 compute-0 sudo[48939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:08:21 compute-0 python3.9[48941]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 09:08:22 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec 01 09:08:22 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Dec 01 09:08:22 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Dec 01 09:08:22 compute-0 systemd[1]: Stopping Network Manager...
Dec 01 09:08:22 compute-0 NetworkManager[7186]: <info>  [1764580102.0122] caught SIGTERM, shutting down normally.
Dec 01 09:08:22 compute-0 NetworkManager[7186]: <info>  [1764580102.0139] dhcp4 (eth0): canceled DHCP transaction
Dec 01 09:08:22 compute-0 NetworkManager[7186]: <info>  [1764580102.0139] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 01 09:08:22 compute-0 NetworkManager[7186]: <info>  [1764580102.0139] dhcp4 (eth0): state changed no lease
Dec 01 09:08:22 compute-0 NetworkManager[7186]: <info>  [1764580102.0142] manager: NetworkManager state is now CONNECTED_SITE
Dec 01 09:08:22 compute-0 NetworkManager[7186]: <info>  [1764580102.0226] exiting (success)
Dec 01 09:08:22 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 01 09:08:22 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 01 09:08:22 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Dec 01 09:08:22 compute-0 systemd[1]: Stopped Network Manager.
Dec 01 09:08:22 compute-0 systemd[1]: NetworkManager.service: Consumed 11.459s CPU time, 4.1M memory peak, read 0B from disk, written 21.5K to disk.
Dec 01 09:08:22 compute-0 systemd[1]: Starting Network Manager...
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.1134] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:fbf967f0-219c-4ceb-b589-3e4f3756d2b4)
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.1136] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.1185] manager[0x55bcd234d090]: monitoring kernel firmware directory '/lib/firmware'.
Dec 01 09:08:22 compute-0 systemd[1]: Starting Hostname Service...
Dec 01 09:08:22 compute-0 systemd[1]: Started Hostname Service.
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2230] hostname: hostname: using hostnamed
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2231] hostname: static hostname changed from (none) to "compute-0"
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2238] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2244] manager[0x55bcd234d090]: rfkill: Wi-Fi hardware radio set enabled
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2244] manager[0x55bcd234d090]: rfkill: WWAN hardware radio set enabled
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2276] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2289] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2289] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2290] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2291] manager: Networking is enabled by state file
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2293] settings: Loaded settings plugin: keyfile (internal)
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2297] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2326] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2337] dhcp: init: Using DHCP client 'internal'
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2340] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2346] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2352] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2361] device (lo): Activation: starting connection 'lo' (d32b959f-25b9-49e2-b1c9-8c743b9b7f56)
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2368] device (eth0): carrier: link connected
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2373] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2379] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2379] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2387] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2394] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2400] device (eth1): carrier: link connected
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2405] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2410] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (178f6c72-5a8d-5d35-b5e8-6b22bb1c98c8) (indicated)
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2411] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2417] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2424] device (eth1): Activation: starting connection 'ci-private-network' (178f6c72-5a8d-5d35-b5e8-6b22bb1c98c8)
Dec 01 09:08:22 compute-0 systemd[1]: Started Network Manager.
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2435] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2442] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2445] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2447] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2449] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2452] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2454] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2457] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2461] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2479] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2482] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2488] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2500] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2507] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2509] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2512] device (lo): Activation: successful, device activated.
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2519] dhcp4 (eth0): state changed new lease, address=38.102.83.132
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2526] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 01 09:08:22 compute-0 systemd[1]: Starting Network Manager Wait Online...
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2739] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 01 09:08:22 compute-0 sudo[48939]: pam_unix(sudo:session): session closed for user root
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2791] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2807] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2824] manager: NetworkManager state is now CONNECTED_LOCAL
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2840] device (eth1): Activation: successful, device activated.
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2902] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2908] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2919] manager: NetworkManager state is now CONNECTED_SITE
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2932] device (eth0): Activation: successful, device activated.
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2942] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 01 09:08:22 compute-0 NetworkManager[48954]: <info>  [1764580102.2952] manager: startup complete
Dec 01 09:08:22 compute-0 systemd[1]: Finished Network Manager Wait Online.
Dec 01 09:08:22 compute-0 sudo[49166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qiazrznxyjbrrgxatcwtpbdzhjkihcxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580102.4545271-168-186244879476139/AnsiballZ_dnf.py'
Dec 01 09:08:22 compute-0 sudo[49166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:08:22 compute-0 python3.9[49168]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:08:27 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 01 09:08:27 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 01 09:08:27 compute-0 systemd[1]: Reloading.
Dec 01 09:08:27 compute-0 systemd-rc-local-generator[49217]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:08:27 compute-0 systemd-sysv-generator[49223]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:08:27 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 01 09:08:28 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 01 09:08:28 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 01 09:08:28 compute-0 systemd[1]: run-rd9a7ddd98084426fa88375df5b2bc6b4.service: Deactivated successfully.
Dec 01 09:08:28 compute-0 sudo[49166]: pam_unix(sudo:session): session closed for user root
Dec 01 09:08:29 compute-0 sudo[49625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqgavandgervocsqllezghboftnzhown ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580108.8844175-180-227677073414972/AnsiballZ_stat.py'
Dec 01 09:08:29 compute-0 sudo[49625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:08:29 compute-0 python3.9[49627]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:08:29 compute-0 sudo[49625]: pam_unix(sudo:session): session closed for user root
Dec 01 09:08:29 compute-0 sudo[49777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qagggitmrrdrkimimkvstoyegecklkmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580109.537063-189-187506678065078/AnsiballZ_ini_file.py'
Dec 01 09:08:29 compute-0 sudo[49777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:08:30 compute-0 python3.9[49779]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:08:30 compute-0 sudo[49777]: pam_unix(sudo:session): session closed for user root
Dec 01 09:08:30 compute-0 sudo[49931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zprcuhtixhwsbuvnipmbjupjmkvcjhyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580110.412092-199-141484479380081/AnsiballZ_ini_file.py'
Dec 01 09:08:30 compute-0 sudo[49931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:08:30 compute-0 python3.9[49933]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:08:30 compute-0 sudo[49931]: pam_unix(sudo:session): session closed for user root
Dec 01 09:08:31 compute-0 sudo[50083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzdoiugnnyaltyhkkkkrxkopadhlstwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580110.9931517-199-82300611561512/AnsiballZ_ini_file.py'
Dec 01 09:08:31 compute-0 sudo[50083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:08:31 compute-0 python3.9[50085]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:08:31 compute-0 sudo[50083]: pam_unix(sudo:session): session closed for user root
Dec 01 09:08:32 compute-0 sudo[50235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjywqyqjxylgixvsismzvthjxlidpepo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580111.8481991-214-225427936849842/AnsiballZ_ini_file.py'
Dec 01 09:08:32 compute-0 sudo[50235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:08:32 compute-0 python3.9[50237]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:08:32 compute-0 sudo[50235]: pam_unix(sudo:session): session closed for user root
Dec 01 09:08:32 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 01 09:08:32 compute-0 sudo[50387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdfcgkasjcwotecjhoaitaaaomftlray ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580112.4810522-214-246921150973444/AnsiballZ_ini_file.py'
Dec 01 09:08:32 compute-0 sudo[50387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:08:32 compute-0 python3.9[50389]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:08:32 compute-0 sudo[50387]: pam_unix(sudo:session): session closed for user root
Dec 01 09:08:33 compute-0 sudo[50539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwhirpztvgohvojuyqtpbbtmthmtygdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580113.0771518-229-118631452750966/AnsiballZ_stat.py'
Dec 01 09:08:33 compute-0 sudo[50539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:08:33 compute-0 python3.9[50541]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:08:33 compute-0 sudo[50539]: pam_unix(sudo:session): session closed for user root
Dec 01 09:08:34 compute-0 sudo[50662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-citfjdqwbdpznhslsdhldegszrlhzphq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580113.0771518-229-118631452750966/AnsiballZ_copy.py'
Dec 01 09:08:34 compute-0 sudo[50662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:08:34 compute-0 python3.9[50664]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580113.0771518-229-118631452750966/.source _original_basename=.7bqdm2kk follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:08:34 compute-0 sudo[50662]: pam_unix(sudo:session): session closed for user root
Dec 01 09:08:34 compute-0 sudo[50814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kccfdoixgvybuezlnvzejzrgxvelkeoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580114.3624692-244-40463086660651/AnsiballZ_file.py'
Dec 01 09:08:34 compute-0 sudo[50814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:08:34 compute-0 python3.9[50816]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:08:34 compute-0 sudo[50814]: pam_unix(sudo:session): session closed for user root
Dec 01 09:08:35 compute-0 sudo[50966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxvnhoontyxwlkshdsfrvmlpklmyrpxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580115.0213766-252-137912306201000/AnsiballZ_edpm_os_net_config_mappings.py'
Dec 01 09:08:35 compute-0 sudo[50966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:08:35 compute-0 python3.9[50968]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Dec 01 09:08:35 compute-0 sudo[50966]: pam_unix(sudo:session): session closed for user root
Dec 01 09:08:36 compute-0 sudo[51118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcvwvvpqknewchdulfnsxryktchefvtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580115.8524637-261-15926119456396/AnsiballZ_file.py'
Dec 01 09:08:36 compute-0 sudo[51118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:08:36 compute-0 python3.9[51120]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:08:36 compute-0 sudo[51118]: pam_unix(sudo:session): session closed for user root
Dec 01 09:08:36 compute-0 sudo[51270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgthfrndcctcpifhbxlltghinhhydkmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580116.6069558-271-240311507630218/AnsiballZ_stat.py'
Dec 01 09:08:36 compute-0 sudo[51270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:08:37 compute-0 sudo[51270]: pam_unix(sudo:session): session closed for user root
Dec 01 09:08:37 compute-0 sudo[51393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzybkvqlcexjrnetdpxypqwhmepnypzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580116.6069558-271-240311507630218/AnsiballZ_copy.py'
Dec 01 09:08:37 compute-0 sudo[51393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:08:37 compute-0 sudo[51393]: pam_unix(sudo:session): session closed for user root
Dec 01 09:08:38 compute-0 sudo[51545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shsywiirhhcltrkyuokkiocwgfszfssn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580117.7174911-286-12327459403700/AnsiballZ_slurp.py'
Dec 01 09:08:38 compute-0 sudo[51545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:08:38 compute-0 python3.9[51547]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Dec 01 09:08:38 compute-0 sudo[51545]: pam_unix(sudo:session): session closed for user root
Dec 01 09:08:39 compute-0 sudo[51720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdibnyjxdslmumiocvdhrzmehfaykzua ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580118.5534203-295-127613937801440/async_wrapper.py j688281423766 300 /home/zuul/.ansible/tmp/ansible-tmp-1764580118.5534203-295-127613937801440/AnsiballZ_edpm_os_net_config.py _'
Dec 01 09:08:39 compute-0 sudo[51720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:08:39 compute-0 ansible-async_wrapper.py[51722]: Invoked with j688281423766 300 /home/zuul/.ansible/tmp/ansible-tmp-1764580118.5534203-295-127613937801440/AnsiballZ_edpm_os_net_config.py _
Dec 01 09:08:39 compute-0 ansible-async_wrapper.py[51725]: Starting module and watcher
Dec 01 09:08:39 compute-0 ansible-async_wrapper.py[51725]: Start watching 51726 (300)
Dec 01 09:08:39 compute-0 ansible-async_wrapper.py[51726]: Start module (51726)
Dec 01 09:08:39 compute-0 ansible-async_wrapper.py[51722]: Return async_wrapper task started.
Dec 01 09:08:39 compute-0 sudo[51720]: pam_unix(sudo:session): session closed for user root
Dec 01 09:08:39 compute-0 python3.9[51727]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Dec 01 09:08:40 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Dec 01 09:08:40 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Dec 01 09:08:40 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Dec 01 09:08:40 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Dec 01 09:08:40 compute-0 kernel: cfg80211: failed to load regulatory.db
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.4708] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51728 uid=0 result="success"
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.4722] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51728 uid=0 result="success"
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5144] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5145] audit: op="connection-add" uuid="6ff9277f-1405-47a9-9bd1-aea2fd3b8890" name="br-ex-br" pid=51728 uid=0 result="success"
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5157] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5158] audit: op="connection-add" uuid="b7d75a34-a156-4703-acf7-0960e9a4a3a8" name="br-ex-port" pid=51728 uid=0 result="success"
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5172] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5174] audit: op="connection-add" uuid="8ba334ab-5c46-407d-84e6-a38a95437ad2" name="eth1-port" pid=51728 uid=0 result="success"
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5185] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5187] audit: op="connection-add" uuid="8ad1b859-1982-4519-beea-586c02c240f1" name="vlan20-port" pid=51728 uid=0 result="success"
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5198] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5200] audit: op="connection-add" uuid="6415e037-1493-4144-b9b8-d12d21267004" name="vlan21-port" pid=51728 uid=0 result="success"
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5209] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5210] audit: op="connection-add" uuid="ce313015-bf7a-4e58-b576-c92d665db680" name="vlan22-port" pid=51728 uid=0 result="success"
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5220] manager: (vlan23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5221] audit: op="connection-add" uuid="f70ec55e-8747-47cb-9d67-31105dd4392b" name="vlan23-port" pid=51728 uid=0 result="success"
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5239] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,ipv6.dhcp-timeout,ipv6.method,ipv6.addr-gen-mode,connection.autoconnect-priority,connection.timestamp" pid=51728 uid=0 result="success"
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5253] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5254] audit: op="connection-add" uuid="64a49a94-4c59-4bd3-bee9-d0d6482501e6" name="br-ex-if" pid=51728 uid=0 result="success"
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5293] audit: op="connection-update" uuid="178f6c72-5a8d-5d35-b5e8-6b22bb1c98c8" name="ci-private-network" args="ipv4.never-default,ipv4.dns,ipv4.method,ipv4.addresses,ipv4.routes,ipv4.routing-rules,ipv6.dns,ipv6.method,ipv6.addresses,ipv6.addr-gen-mode,ipv6.routing-rules,ipv6.routes,ovs-interface.type,ovs-external-ids.data,connection.port-type,connection.slave-type,connection.controller,connection.master,connection.timestamp" pid=51728 uid=0 result="success"
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5308] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5309] audit: op="connection-add" uuid="3f81cbde-6e8a-48c2-917a-ed56ef6b1b23" name="vlan20-if" pid=51728 uid=0 result="success"
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5322] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5323] audit: op="connection-add" uuid="099afefb-306c-4b88-8220-cc7739999342" name="vlan21-if" pid=51728 uid=0 result="success"
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5336] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5337] audit: op="connection-add" uuid="e2eeff41-c3c2-496d-8f47-3776c5e2de71" name="vlan22-if" pid=51728 uid=0 result="success"
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5351] manager: (vlan23): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/15)
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5352] audit: op="connection-add" uuid="b3a395d1-9bfe-4345-8be8-1087ccd4ef3f" name="vlan23-if" pid=51728 uid=0 result="success"
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5362] audit: op="connection-delete" uuid="c6b7d4f6-4237-35c7-90cb-622f3da1d185" name="Wired connection 1" pid=51728 uid=0 result="success"
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5370] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5378] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5381] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (6ff9277f-1405-47a9-9bd1-aea2fd3b8890)
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5381] audit: op="connection-activate" uuid="6ff9277f-1405-47a9-9bd1-aea2fd3b8890" name="br-ex-br" pid=51728 uid=0 result="success"
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5382] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5387] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5389] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (b7d75a34-a156-4703-acf7-0960e9a4a3a8)
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5391] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5395] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5398] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (8ba334ab-5c46-407d-84e6-a38a95437ad2)
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5399] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5404] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5406] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (8ad1b859-1982-4519-beea-586c02c240f1)
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5407] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5412] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5415] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (6415e037-1493-4144-b9b8-d12d21267004)
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5416] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5420] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5423] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (ce313015-bf7a-4e58-b576-c92d665db680)
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5425] device (vlan23)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5429] device (vlan23)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5432] device (vlan23)[Open vSwitch Port]: Activation: starting connection 'vlan23-port' (f70ec55e-8747-47cb-9d67-31105dd4392b)
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5432] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5434] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5435] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5439] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5442] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5445] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (64a49a94-4c59-4bd3-bee9-d0d6482501e6)
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5446] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5448] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5449] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5449] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5450] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5459] device (eth1): disconnecting for new activation request.
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5460] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5462] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5463] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5464] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5466] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5469] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5473] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (3f81cbde-6e8a-48c2-917a-ed56ef6b1b23)
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5473] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5475] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5476] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5477] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5479] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5482] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5484] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (099afefb-306c-4b88-8220-cc7739999342)
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5485] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5487] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5488] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5489] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5490] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5493] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5496] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (e2eeff41-c3c2-496d-8f47-3776c5e2de71)
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5496] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5498] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5499] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5500] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5502] device (vlan23)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5507] device (vlan23)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5509] device (vlan23)[Open vSwitch Interface]: Activation: starting connection 'vlan23-if' (b3a395d1-9bfe-4345-8be8-1087ccd4ef3f)
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5510] device (vlan23)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5512] device (vlan23)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5514] device (vlan23)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5514] device (vlan23)[Open vSwitch Port]: Activation: connection 'vlan23-port' attached as port, continuing activation
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5515] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5526] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,ipv6.method,ipv6.addr-gen-mode,connection.autoconnect-priority" pid=51728 uid=0 result="success"
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5528] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5530] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5531] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5536] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5538] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5540] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5546] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5548] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 kernel: ovs-system: entered promiscuous mode
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5554] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5560] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5566] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5567] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 systemd-udevd[51732]: Network interface NamePolicy= disabled on kernel command line.
Dec 01 09:08:41 compute-0 kernel: Timeout policy base is empty
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5571] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5576] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5580] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5582] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5587] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5591] device (vlan23)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5594] device (vlan23)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5596] device (vlan23)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5600] device (vlan23)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5604] dhcp4 (eth0): canceled DHCP transaction
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5604] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5604] dhcp4 (eth0): state changed no lease
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5605] dhcp4 (eth0): activation: beginning transaction (no timeout)
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5614] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5617] audit: op="device-reapply" interface="eth1" ifindex=3 pid=51728 uid=0 result="fail" reason="Device is not activated"
Dec 01 09:08:41 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5654] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5657] dhcp4 (eth0): state changed new lease, address=38.102.83.132
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5661] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5678] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Dec 01 09:08:41 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5725] device (eth1): disconnecting for new activation request.
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5726] audit: op="connection-activate" uuid="178f6c72-5a8d-5d35-b5e8-6b22bb1c98c8" name="ci-private-network" pid=51728 uid=0 result="success"
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5727] device (vlan23)[Open vSwitch Interface]: Activation: connection 'vlan23-if' attached as port, continuing activation
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5732] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Dec 01 09:08:41 compute-0 kernel: br-ex: entered promiscuous mode
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5850] device (eth1): Activation: starting connection 'ci-private-network' (178f6c72-5a8d-5d35-b5e8-6b22bb1c98c8)
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5881] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5887] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5899] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5901] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51728 uid=0 result="success"
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5901] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5903] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5905] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5909] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5912] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5914] device (vlan23)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5918] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5932] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 kernel: vlan22: entered promiscuous mode
Dec 01 09:08:41 compute-0 systemd-udevd[51733]: Network interface NamePolicy= disabled on kernel command line.
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5938] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5944] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5952] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5958] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5964] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5969] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5974] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5980] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5988] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.5994] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.6000] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.6004] device (vlan23)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.6010] device (vlan23)[Open vSwitch Port]: Activation: successful, device activated.
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.6019] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Dec 01 09:08:41 compute-0 kernel: vlan21: entered promiscuous mode
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.6027] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.6056] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.6060] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.6071] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 kernel: vlan23: entered promiscuous mode
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.6084] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.6090] device (eth1): Activation: successful, device activated.
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.6132] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.6139] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 kernel: vlan20: entered promiscuous mode
Dec 01 09:08:41 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.6158] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Dec 01 09:08:41 compute-0 systemd-udevd[51839]: Network interface NamePolicy= disabled on kernel command line.
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.6170] device (vlan23)[Open vSwitch Interface]: carrier: link connected
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.6183] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.6207] device (vlan23)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.6214] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.6216] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.6219] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.6225] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.6230] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.6235] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.6243] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.6245] device (vlan23)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.6248] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.6255] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.6260] device (vlan23)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.6266] device (vlan23)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.6279] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.6293] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.6336] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.6339] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 01 09:08:41 compute-0 NetworkManager[48954]: <info>  [1764580121.6346] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 01 09:08:42 compute-0 NetworkManager[48954]: <info>  [1764580122.7561] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51728 uid=0 result="success"
Dec 01 09:08:42 compute-0 NetworkManager[48954]: <info>  [1764580122.9494] checkpoint[0x55bcd2323950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Dec 01 09:08:42 compute-0 NetworkManager[48954]: <info>  [1764580122.9497] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51728 uid=0 result="success"
Dec 01 09:08:43 compute-0 sudo[52085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkukjyvzpseyevqyolzlsyuckhzcuhtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580122.579852-295-182170145995311/AnsiballZ_async_status.py'
Dec 01 09:08:43 compute-0 sudo[52085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:08:43 compute-0 NetworkManager[48954]: <info>  [1764580123.3528] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51728 uid=0 result="success"
Dec 01 09:08:43 compute-0 NetworkManager[48954]: <info>  [1764580123.3542] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51728 uid=0 result="success"
Dec 01 09:08:43 compute-0 python3.9[52088]: ansible-ansible.legacy.async_status Invoked with jid=j688281423766.51722 mode=status _async_dir=/root/.ansible_async
Dec 01 09:08:43 compute-0 sudo[52085]: pam_unix(sudo:session): session closed for user root
Dec 01 09:08:43 compute-0 NetworkManager[48954]: <info>  [1764580123.5862] audit: op="networking-control" arg="global-dns-configuration" pid=51728 uid=0 result="success"
Dec 01 09:08:43 compute-0 NetworkManager[48954]: <info>  [1764580123.5887] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Dec 01 09:08:43 compute-0 NetworkManager[48954]: <info>  [1764580123.5909] audit: op="networking-control" arg="global-dns-configuration" pid=51728 uid=0 result="success"
Dec 01 09:08:43 compute-0 NetworkManager[48954]: <info>  [1764580123.5929] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51728 uid=0 result="success"
Dec 01 09:08:43 compute-0 NetworkManager[48954]: <info>  [1764580123.7388] checkpoint[0x55bcd2323a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Dec 01 09:08:43 compute-0 NetworkManager[48954]: <info>  [1764580123.7392] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51728 uid=0 result="success"
Dec 01 09:08:43 compute-0 ansible-async_wrapper.py[51726]: Module complete (51726)
Dec 01 09:08:44 compute-0 ansible-async_wrapper.py[51725]: Done in kid B.
Dec 01 09:08:46 compute-0 sudo[52190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqsrbrsfqtjwddtwzhkdbkmdkhbjjfgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580122.579852-295-182170145995311/AnsiballZ_async_status.py'
Dec 01 09:08:46 compute-0 sudo[52190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:08:46 compute-0 python3.9[52193]: ansible-ansible.legacy.async_status Invoked with jid=j688281423766.51722 mode=status _async_dir=/root/.ansible_async
Dec 01 09:08:46 compute-0 sudo[52190]: pam_unix(sudo:session): session closed for user root
Dec 01 09:08:47 compute-0 sudo[52290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsstepxdyacbxjokidzdlpwlfnmzpdlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580122.579852-295-182170145995311/AnsiballZ_async_status.py'
Dec 01 09:08:47 compute-0 sudo[52290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:08:47 compute-0 python3.9[52292]: ansible-ansible.legacy.async_status Invoked with jid=j688281423766.51722 mode=cleanup _async_dir=/root/.ansible_async
Dec 01 09:08:47 compute-0 sudo[52290]: pam_unix(sudo:session): session closed for user root
Dec 01 09:08:47 compute-0 sudo[52442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czfrtucilnqcdvfctwvorlikwhdzvrkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580127.5453684-322-114557561447449/AnsiballZ_stat.py'
Dec 01 09:08:47 compute-0 sudo[52442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:08:47 compute-0 python3.9[52444]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:08:48 compute-0 sudo[52442]: pam_unix(sudo:session): session closed for user root
Dec 01 09:08:48 compute-0 sudo[52565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vigffkyqexbsgbmcexgsnmjeyzrlsalz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580127.5453684-322-114557561447449/AnsiballZ_copy.py'
Dec 01 09:08:48 compute-0 sudo[52565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:08:48 compute-0 python3.9[52567]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580127.5453684-322-114557561447449/.source.returncode _original_basename=.ejppru9l follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:08:48 compute-0 sudo[52565]: pam_unix(sudo:session): session closed for user root
Dec 01 09:08:48 compute-0 sudo[52717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebcqhynjtpmlucpodwiwglrumxyiychl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580128.7046254-338-33903186203364/AnsiballZ_stat.py'
Dec 01 09:08:48 compute-0 sudo[52717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:08:49 compute-0 python3.9[52719]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:08:49 compute-0 sudo[52717]: pam_unix(sudo:session): session closed for user root
Dec 01 09:08:49 compute-0 sudo[52840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqczyzzpnjbvdotqiangumwclbtfyzmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580128.7046254-338-33903186203364/AnsiballZ_copy.py'
Dec 01 09:08:49 compute-0 sudo[52840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:08:49 compute-0 python3.9[52842]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580128.7046254-338-33903186203364/.source.cfg _original_basename=.jf5y7acg follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:08:49 compute-0 sudo[52840]: pam_unix(sudo:session): session closed for user root
Dec 01 09:08:50 compute-0 sudo[52993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skknnbbrvbclhtvvlotvpfeiqpczhryr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580129.8058836-353-138675146570713/AnsiballZ_systemd.py'
Dec 01 09:08:50 compute-0 sudo[52993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:08:50 compute-0 python3.9[52995]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 09:08:50 compute-0 systemd[1]: Reloading Network Manager...
Dec 01 09:08:50 compute-0 NetworkManager[48954]: <info>  [1764580130.3978] audit: op="reload" arg="0" pid=52999 uid=0 result="success"
Dec 01 09:08:50 compute-0 NetworkManager[48954]: <info>  [1764580130.3985] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Dec 01 09:08:50 compute-0 systemd[1]: Reloaded Network Manager.
Dec 01 09:08:50 compute-0 sudo[52993]: pam_unix(sudo:session): session closed for user root
Dec 01 09:08:50 compute-0 sshd-session[44956]: Connection closed by 192.168.122.30 port 57622
Dec 01 09:08:50 compute-0 sshd-session[44953]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:08:50 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Dec 01 09:08:50 compute-0 systemd[1]: session-10.scope: Consumed 47.040s CPU time.
Dec 01 09:08:50 compute-0 systemd-logind[788]: Session 10 logged out. Waiting for processes to exit.
Dec 01 09:08:50 compute-0 systemd-logind[788]: Removed session 10.
Dec 01 09:08:52 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 01 09:08:56 compute-0 sshd-session[53032]: Accepted publickey for zuul from 192.168.122.30 port 53478 ssh2: ECDSA SHA256:qeYLzcMt7u+aIXWvxTim9ZQrhR80x0VMZgLc05vjQmw
Dec 01 09:08:56 compute-0 systemd-logind[788]: New session 11 of user zuul.
Dec 01 09:08:56 compute-0 systemd[1]: Started Session 11 of User zuul.
Dec 01 09:08:56 compute-0 sshd-session[53032]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:08:57 compute-0 python3.9[53185]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:08:58 compute-0 python3.9[53339]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 09:08:59 compute-0 python3.9[53533]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:09:00 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 01 09:09:00 compute-0 sshd-session[53035]: Connection closed by 192.168.122.30 port 53478
Dec 01 09:09:00 compute-0 sshd-session[53032]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:09:00 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Dec 01 09:09:00 compute-0 systemd[1]: session-11.scope: Consumed 2.233s CPU time.
Dec 01 09:09:00 compute-0 systemd-logind[788]: Session 11 logged out. Waiting for processes to exit.
Dec 01 09:09:00 compute-0 systemd-logind[788]: Removed session 11.
Dec 01 09:09:05 compute-0 sshd-session[53562]: Accepted publickey for zuul from 192.168.122.30 port 44326 ssh2: ECDSA SHA256:qeYLzcMt7u+aIXWvxTim9ZQrhR80x0VMZgLc05vjQmw
Dec 01 09:09:05 compute-0 systemd-logind[788]: New session 12 of user zuul.
Dec 01 09:09:05 compute-0 systemd[1]: Started Session 12 of User zuul.
Dec 01 09:09:05 compute-0 sshd-session[53562]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:09:06 compute-0 python3.9[53716]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:09:07 compute-0 python3.9[53870]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:09:08 compute-0 sudo[54024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vakirenjaxdgliukxmjuvfxbigyhhdzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580147.7642248-40-134319229267301/AnsiballZ_setup.py'
Dec 01 09:09:08 compute-0 sudo[54024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:08 compute-0 python3.9[54026]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 09:09:08 compute-0 sudo[54024]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:09 compute-0 sudo[54108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ronpijfprahtnfzoqncfiwtdbujpyqhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580147.7642248-40-134319229267301/AnsiballZ_dnf.py'
Dec 01 09:09:09 compute-0 sudo[54108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:09 compute-0 python3.9[54110]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:09:10 compute-0 sudo[54108]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:10 compute-0 sudo[54262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybawjtaunaaqklyraramvwhvrcixoagn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580150.6807203-52-141763950518541/AnsiballZ_setup.py'
Dec 01 09:09:10 compute-0 sudo[54262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:11 compute-0 python3.9[54264]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 09:09:11 compute-0 sudo[54262]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:12 compute-0 sudo[54457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygtvchlrxrmbvndvvagqawwjxuylxltd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580151.7198498-63-77291000764185/AnsiballZ_file.py'
Dec 01 09:09:12 compute-0 sudo[54457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:12 compute-0 python3.9[54459]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:09:12 compute-0 sudo[54457]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:12 compute-0 sudo[54610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mehkapvaxfheoytafzkvsvyxopfifulq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580152.4902728-71-6297049201712/AnsiballZ_command.py'
Dec 01 09:09:12 compute-0 sudo[54610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:13 compute-0 python3.9[54612]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:09:13 compute-0 podman[54613]: 2025-12-01 09:09:13.167727101 +0000 UTC m=+0.042176780 system refresh
Dec 01 09:09:13 compute-0 sudo[54610]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:13 compute-0 sudo[54774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvohusizienhvzofenevryvizsgqbzkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580153.327181-79-207350350659281/AnsiballZ_stat.py'
Dec 01 09:09:13 compute-0 sudo[54774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:13 compute-0 python3.9[54776]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:09:14 compute-0 sudo[54774]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:14 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 01 09:09:14 compute-0 sudo[54897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnepbzahaxopqlzftfunityqqthtrxby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580153.327181-79-207350350659281/AnsiballZ_copy.py'
Dec 01 09:09:14 compute-0 sudo[54897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:14 compute-0 python3.9[54899]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580153.327181-79-207350350659281/.source.json follow=False _original_basename=podman_network_config.j2 checksum=b092e8926a42991481e8661bbb2548b1c09df469 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:09:14 compute-0 sudo[54897]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:15 compute-0 sudo[55049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezzzglviyyckyfqaygnygjcgrjfiqnwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580154.8477156-94-166247171625002/AnsiballZ_stat.py'
Dec 01 09:09:15 compute-0 sudo[55049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:15 compute-0 python3.9[55051]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:09:15 compute-0 sudo[55049]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:15 compute-0 sudo[55172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldzwkwbagwkcbqzkkdnxjldvjsvssuii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580154.8477156-94-166247171625002/AnsiballZ_copy.py'
Dec 01 09:09:15 compute-0 sudo[55172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:15 compute-0 python3.9[55174]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764580154.8477156-94-166247171625002/.source.conf follow=False _original_basename=registries.conf.j2 checksum=f95551851a3aad1fadf39ba40ad5808b10502fe1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:09:15 compute-0 sudo[55172]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:16 compute-0 sudo[55324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivsmifvygzzwyxqcwxsbaajejuxcmscn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580156.043222-110-110163029481684/AnsiballZ_ini_file.py'
Dec 01 09:09:16 compute-0 sudo[55324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:16 compute-0 python3.9[55326]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:09:16 compute-0 sudo[55324]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:16 compute-0 sudo[55476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxykbuijyquzvebwmyenececcyiokcjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580156.7515373-110-139002302920465/AnsiballZ_ini_file.py'
Dec 01 09:09:17 compute-0 sudo[55476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:17 compute-0 python3.9[55478]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:09:17 compute-0 sudo[55476]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:17 compute-0 sudo[55628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmoqpermoyzwlpqkjfgnzoaluykbdbrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580157.3236601-110-63694043717440/AnsiballZ_ini_file.py'
Dec 01 09:09:17 compute-0 sudo[55628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:17 compute-0 python3.9[55630]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:09:17 compute-0 sudo[55628]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:18 compute-0 sudo[55780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bitfnahdsgnqhtvespksquedsgczncuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580157.9668477-110-56052418611430/AnsiballZ_ini_file.py'
Dec 01 09:09:18 compute-0 sudo[55780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:18 compute-0 python3.9[55782]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:09:18 compute-0 sudo[55780]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:18 compute-0 sudo[55932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmpvdzftemqmovmzuqadozkbxnqlgddq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580158.6700838-141-227907837017104/AnsiballZ_dnf.py'
Dec 01 09:09:18 compute-0 sudo[55932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:19 compute-0 python3.9[55934]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:09:20 compute-0 sudo[55932]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:21 compute-0 sudo[56085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apmsblqbgsudccedtywzzgmejvhbqpmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580160.9353695-152-140850461260489/AnsiballZ_setup.py'
Dec 01 09:09:21 compute-0 sudo[56085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:21 compute-0 python3.9[56087]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:09:21 compute-0 sudo[56085]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:21 compute-0 sudo[56239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgvbdappdituhrsdyocyidzimiixtbbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580161.706433-160-157956843332483/AnsiballZ_stat.py'
Dec 01 09:09:21 compute-0 sudo[56239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:22 compute-0 python3.9[56241]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:09:22 compute-0 sudo[56239]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:22 compute-0 sudo[56391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztsckondevhykgmcjhvpqtwmvolydjgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580162.3259923-169-49233714996782/AnsiballZ_stat.py'
Dec 01 09:09:22 compute-0 sudo[56391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:22 compute-0 python3.9[56393]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:09:22 compute-0 sudo[56391]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:23 compute-0 sudo[56543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eohsfoicuzwyuqlxmytntvvfdpoembjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580163.028051-179-30983401334301/AnsiballZ_command.py'
Dec 01 09:09:23 compute-0 sudo[56543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:23 compute-0 python3.9[56545]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:09:23 compute-0 sudo[56543]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:24 compute-0 sudo[56696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsjjuweforqyasdmireycjqdifqswwkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580163.6843483-189-81199300387358/AnsiballZ_service_facts.py'
Dec 01 09:09:24 compute-0 sudo[56696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:24 compute-0 python3.9[56698]: ansible-service_facts Invoked
Dec 01 09:09:24 compute-0 network[56715]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 01 09:09:24 compute-0 network[56716]: 'network-scripts' will be removed from distribution in near future.
Dec 01 09:09:24 compute-0 network[56717]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 01 09:09:26 compute-0 sudo[56696]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:28 compute-0 sudo[57000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iooatzkwdribsostanipqzvftpvkwvzg ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1764580167.814124-204-4790009947033/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1764580167.814124-204-4790009947033/args'
Dec 01 09:09:28 compute-0 sudo[57000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:28 compute-0 sudo[57000]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:28 compute-0 sudo[57167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdhqtsaaxbnkcxsovobomhwaopjfuoor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580168.3673484-215-268865589201487/AnsiballZ_dnf.py'
Dec 01 09:09:28 compute-0 sudo[57167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:28 compute-0 python3.9[57169]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:09:30 compute-0 sudo[57167]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:31 compute-0 sudo[57320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prdabggixduvqaagkymfmiwqxscsppes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580170.4342585-228-132606430596246/AnsiballZ_package_facts.py'
Dec 01 09:09:31 compute-0 sudo[57320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:31 compute-0 python3.9[57322]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Dec 01 09:09:31 compute-0 sudo[57320]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:32 compute-0 sudo[57472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnwnciucvtrogrgmcmgvcvokoaqujics ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580172.1676557-238-90243860553674/AnsiballZ_stat.py'
Dec 01 09:09:32 compute-0 sudo[57472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:32 compute-0 python3.9[57474]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:09:32 compute-0 sudo[57472]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:33 compute-0 sudo[57597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndrdvrjavmpdrixlftmdjbceomnrscza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580172.1676557-238-90243860553674/AnsiballZ_copy.py'
Dec 01 09:09:33 compute-0 sudo[57597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:33 compute-0 python3.9[57599]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580172.1676557-238-90243860553674/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:09:33 compute-0 sudo[57597]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:33 compute-0 sudo[57751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohspjqurgaolivstouatfpttewyruxye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580173.536461-253-134057119867782/AnsiballZ_stat.py'
Dec 01 09:09:33 compute-0 sudo[57751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:33 compute-0 python3.9[57753]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:09:34 compute-0 sudo[57751]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:34 compute-0 sudo[57876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsebpqoycbryqghgzhrprqgbssrkgmnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580173.536461-253-134057119867782/AnsiballZ_copy.py'
Dec 01 09:09:34 compute-0 sudo[57876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:34 compute-0 python3.9[57878]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580173.536461-253-134057119867782/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:09:34 compute-0 sudo[57876]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:35 compute-0 sudo[58030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zeepztjdhgfuuuqaombvxrdaidklifyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580174.9539473-274-165575001784688/AnsiballZ_lineinfile.py'
Dec 01 09:09:35 compute-0 sudo[58030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:35 compute-0 python3.9[58032]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:09:35 compute-0 sudo[58030]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:36 compute-0 sudo[58184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbjsjtardbvpmaeqtpeluhjuxbunjvyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580176.2086308-289-116498324639466/AnsiballZ_setup.py'
Dec 01 09:09:36 compute-0 sudo[58184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:36 compute-0 python3.9[58186]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 09:09:36 compute-0 sudo[58184]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:37 compute-0 sudo[58268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwjwbxhhmwivvlzxyaxnilflolprbjme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580176.2086308-289-116498324639466/AnsiballZ_systemd.py'
Dec 01 09:09:37 compute-0 sudo[58268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:37 compute-0 python3.9[58270]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:09:37 compute-0 sudo[58268]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:38 compute-0 sudo[58422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgbhvwklyowqqfvfqitjvwkjmdrdiwjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580178.5274808-305-32367441087200/AnsiballZ_setup.py'
Dec 01 09:09:38 compute-0 sudo[58422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:39 compute-0 python3.9[58424]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 09:09:39 compute-0 sudo[58422]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:39 compute-0 sudo[58506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxkjxjvbnlbwigihenhpobnofnceulvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580178.5274808-305-32367441087200/AnsiballZ_systemd.py'
Dec 01 09:09:39 compute-0 sudo[58506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:39 compute-0 python3.9[58508]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 09:09:39 compute-0 chronyd[791]: chronyd exiting
Dec 01 09:09:39 compute-0 systemd[1]: Stopping NTP client/server...
Dec 01 09:09:39 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Dec 01 09:09:39 compute-0 systemd[1]: Stopped NTP client/server.
Dec 01 09:09:39 compute-0 systemd[1]: Starting NTP client/server...
Dec 01 09:09:39 compute-0 chronyd[58516]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec 01 09:09:39 compute-0 chronyd[58516]: Frequency -28.434 +/- 0.269 ppm read from /var/lib/chrony/drift
Dec 01 09:09:39 compute-0 chronyd[58516]: Loaded seccomp filter (level 2)
Dec 01 09:09:39 compute-0 systemd[1]: Started NTP client/server.
Dec 01 09:09:39 compute-0 sudo[58506]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:40 compute-0 sshd-session[53565]: Connection closed by 192.168.122.30 port 44326
Dec 01 09:09:40 compute-0 sshd-session[53562]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:09:40 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Dec 01 09:09:40 compute-0 systemd[1]: session-12.scope: Consumed 24.157s CPU time.
Dec 01 09:09:40 compute-0 systemd-logind[788]: Session 12 logged out. Waiting for processes to exit.
Dec 01 09:09:40 compute-0 systemd-logind[788]: Removed session 12.
Dec 01 09:09:46 compute-0 sshd-session[58542]: Accepted publickey for zuul from 192.168.122.30 port 59686 ssh2: ECDSA SHA256:qeYLzcMt7u+aIXWvxTim9ZQrhR80x0VMZgLc05vjQmw
Dec 01 09:09:46 compute-0 systemd-logind[788]: New session 13 of user zuul.
Dec 01 09:09:46 compute-0 systemd[1]: Started Session 13 of User zuul.
Dec 01 09:09:46 compute-0 sshd-session[58542]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:09:46 compute-0 sudo[58695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mikiahtxotiryudtjkmrcshaygoknbcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580186.3438125-22-196127389191704/AnsiballZ_file.py'
Dec 01 09:09:46 compute-0 sudo[58695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:46 compute-0 python3.9[58697]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:09:47 compute-0 sudo[58695]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:47 compute-0 sudo[58847]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpzczeaduvrtyohctzqolaamrtjkwbyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580187.228003-34-274994467968404/AnsiballZ_stat.py'
Dec 01 09:09:47 compute-0 sudo[58847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:47 compute-0 python3.9[58849]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:09:47 compute-0 sudo[58847]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:48 compute-0 sudo[58970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uukjnivflhixweybwkaxujogozzqcycn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580187.228003-34-274994467968404/AnsiballZ_copy.py'
Dec 01 09:09:48 compute-0 sudo[58970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:48 compute-0 python3.9[58972]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/ceph-networks.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580187.228003-34-274994467968404/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=729ea8396013e3343245d6e934e0dcef55029ad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:09:48 compute-0 sudo[58970]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:48 compute-0 sshd-session[58545]: Connection closed by 192.168.122.30 port 59686
Dec 01 09:09:48 compute-0 sshd-session[58542]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:09:48 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Dec 01 09:09:48 compute-0 systemd[1]: session-13.scope: Consumed 1.402s CPU time.
Dec 01 09:09:48 compute-0 systemd-logind[788]: Session 13 logged out. Waiting for processes to exit.
Dec 01 09:09:48 compute-0 systemd-logind[788]: Removed session 13.
Dec 01 09:09:54 compute-0 sshd-session[58997]: Accepted publickey for zuul from 192.168.122.30 port 34672 ssh2: ECDSA SHA256:qeYLzcMt7u+aIXWvxTim9ZQrhR80x0VMZgLc05vjQmw
Dec 01 09:09:54 compute-0 systemd-logind[788]: New session 14 of user zuul.
Dec 01 09:09:54 compute-0 systemd[1]: Started Session 14 of User zuul.
Dec 01 09:09:54 compute-0 sshd-session[58997]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:09:55 compute-0 python3.9[59150]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:09:56 compute-0 sudo[59304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shgrzgopabulwukmkolzalhvblfwyskb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580196.1927621-33-227108617971325/AnsiballZ_file.py'
Dec 01 09:09:56 compute-0 sudo[59304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:56 compute-0 python3.9[59306]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:09:56 compute-0 sudo[59304]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:57 compute-0 sudo[59479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxuccwfuegjeewacwwifixyckcbaxrpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580197.0280848-41-105053849947433/AnsiballZ_stat.py'
Dec 01 09:09:57 compute-0 sudo[59479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:57 compute-0 python3.9[59481]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:09:57 compute-0 sudo[59479]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:58 compute-0 sudo[59602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grxoggnfmccpgivnefmzpjvzmcavejvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580197.0280848-41-105053849947433/AnsiballZ_copy.py'
Dec 01 09:09:58 compute-0 sudo[59602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:58 compute-0 python3.9[59604]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1764580197.0280848-41-105053849947433/.source.json _original_basename=.lxm6qfx3 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:09:58 compute-0 sudo[59602]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:58 compute-0 sudo[59754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hucwxmpqbzyfdbosofdlsothsoahyumw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580198.6725807-64-24533966855836/AnsiballZ_stat.py'
Dec 01 09:09:58 compute-0 sudo[59754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:59 compute-0 python3.9[59756]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:09:59 compute-0 sudo[59754]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:59 compute-0 sudo[59877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glmayiodownjfhgrzdwiccvjniprpjwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580198.6725807-64-24533966855836/AnsiballZ_copy.py'
Dec 01 09:09:59 compute-0 sudo[59877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:59 compute-0 python3.9[59879]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580198.6725807-64-24533966855836/.source _original_basename=.8r9lzn_l follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:09:59 compute-0 sudo[59877]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:00 compute-0 sudo[60029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqczjkslhjdaosmlkdaoxadarhperfqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580199.7512128-80-119439076266284/AnsiballZ_file.py'
Dec 01 09:10:00 compute-0 sudo[60029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:00 compute-0 python3.9[60031]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:10:00 compute-0 sudo[60029]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:00 compute-0 sudo[60181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mloedqxsfcspubplodyejcynkemflmmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580200.3390658-88-258182885914952/AnsiballZ_stat.py'
Dec 01 09:10:00 compute-0 sudo[60181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:00 compute-0 python3.9[60183]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:10:00 compute-0 sudo[60181]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:01 compute-0 sudo[60304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqaztezicbwxkhkrxqevfutityhfrgol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580200.3390658-88-258182885914952/AnsiballZ_copy.py'
Dec 01 09:10:01 compute-0 sudo[60304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:01 compute-0 python3.9[60306]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764580200.3390658-88-258182885914952/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:10:01 compute-0 sudo[60304]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:01 compute-0 sudo[60456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elcmutafmklbbtbcqhdzajpkvovlelpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580201.429198-88-118041360139777/AnsiballZ_stat.py'
Dec 01 09:10:01 compute-0 sudo[60456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:01 compute-0 python3.9[60458]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:10:01 compute-0 sudo[60456]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:02 compute-0 sudo[60579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhsiaeoyhpaymhhooxevxbbtzsiytaxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580201.429198-88-118041360139777/AnsiballZ_copy.py'
Dec 01 09:10:02 compute-0 sudo[60579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:02 compute-0 python3.9[60581]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764580201.429198-88-118041360139777/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:10:02 compute-0 sudo[60579]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:02 compute-0 sudo[60731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cemvojztopnldfunawzctqwvxnksnplt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580202.5108-117-56738278703198/AnsiballZ_file.py'
Dec 01 09:10:02 compute-0 sudo[60731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:02 compute-0 python3.9[60733]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:10:02 compute-0 sudo[60731]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:03 compute-0 sudo[60883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uavdmeupckgqisobnfufdwaqnvhfgjwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580203.1297405-125-161627068528125/AnsiballZ_stat.py'
Dec 01 09:10:03 compute-0 sudo[60883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:03 compute-0 python3.9[60885]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:10:03 compute-0 sudo[60883]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:03 compute-0 sudo[61006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-joflrfgtfmtofltdhzfegqvavgwvuezo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580203.1297405-125-161627068528125/AnsiballZ_copy.py'
Dec 01 09:10:03 compute-0 sudo[61006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:04 compute-0 python3.9[61008]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580203.1297405-125-161627068528125/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:10:04 compute-0 sudo[61006]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:04 compute-0 sudo[61158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsehnafcolvugeussztvpcxmtoxrjcto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580204.268697-140-234548500461708/AnsiballZ_stat.py'
Dec 01 09:10:04 compute-0 sudo[61158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:04 compute-0 python3.9[61160]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:10:04 compute-0 sudo[61158]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:05 compute-0 sudo[61281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbjqxnwwqexyapzqdkjjpmibqnqymqnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580204.268697-140-234548500461708/AnsiballZ_copy.py'
Dec 01 09:10:05 compute-0 sudo[61281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:05 compute-0 python3.9[61283]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580204.268697-140-234548500461708/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:10:05 compute-0 sudo[61281]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:06 compute-0 sudo[61433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmtoaigljyipluvaigsbizgzazjpnalq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580205.6158936-155-204812125266667/AnsiballZ_systemd.py'
Dec 01 09:10:06 compute-0 sudo[61433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:06 compute-0 python3.9[61435]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:10:06 compute-0 systemd[1]: Reloading.
Dec 01 09:10:06 compute-0 systemd-sysv-generator[61465]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:10:06 compute-0 systemd-rc-local-generator[61462]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:10:06 compute-0 systemd[1]: Reloading.
Dec 01 09:10:06 compute-0 systemd-rc-local-generator[61500]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:10:06 compute-0 systemd-sysv-generator[61503]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:10:06 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Dec 01 09:10:06 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Dec 01 09:10:06 compute-0 sudo[61433]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:07 compute-0 sudo[61660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkfgvgdhwwauttlisgqwhyfvghvflkvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580207.1460686-163-213283092680509/AnsiballZ_stat.py'
Dec 01 09:10:07 compute-0 sudo[61660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:07 compute-0 python3.9[61662]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:10:07 compute-0 sudo[61660]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:07 compute-0 sudo[61783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cismowleybqvatbmcreiapwvcqyenjdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580207.1460686-163-213283092680509/AnsiballZ_copy.py'
Dec 01 09:10:07 compute-0 sudo[61783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:08 compute-0 python3.9[61785]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580207.1460686-163-213283092680509/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:10:08 compute-0 sudo[61783]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:08 compute-0 sudo[61935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhjmrkezrftvfnyubpwazyixukkcfely ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580208.2299519-178-61878134684921/AnsiballZ_stat.py'
Dec 01 09:10:08 compute-0 sudo[61935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:08 compute-0 python3.9[61937]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:10:08 compute-0 sudo[61935]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:08 compute-0 sudo[62058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xitnshvyusdquvhfcextttmmuphveuum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580208.2299519-178-61878134684921/AnsiballZ_copy.py'
Dec 01 09:10:08 compute-0 sudo[62058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:09 compute-0 python3.9[62060]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580208.2299519-178-61878134684921/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:10:09 compute-0 sudo[62058]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:09 compute-0 sudo[62210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlifecvpgygiealehhrlnwvinffeqzex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580209.3336167-193-68846580023185/AnsiballZ_systemd.py'
Dec 01 09:10:09 compute-0 sudo[62210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:09 compute-0 python3.9[62212]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:10:09 compute-0 systemd[1]: Reloading.
Dec 01 09:10:09 compute-0 systemd-sysv-generator[62244]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:10:09 compute-0 systemd-rc-local-generator[62240]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:10:10 compute-0 systemd[1]: Reloading.
Dec 01 09:10:10 compute-0 systemd-rc-local-generator[62278]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:10:10 compute-0 systemd-sysv-generator[62281]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:10:10 compute-0 systemd[1]: Starting Create netns directory...
Dec 01 09:10:10 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 01 09:10:10 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 01 09:10:10 compute-0 systemd[1]: Finished Create netns directory.
Dec 01 09:10:10 compute-0 sudo[62210]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:11 compute-0 python3.9[62439]: ansible-ansible.builtin.service_facts Invoked
Dec 01 09:10:11 compute-0 network[62456]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 01 09:10:11 compute-0 network[62457]: 'network-scripts' will be removed from distribution in near future.
Dec 01 09:10:11 compute-0 network[62458]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 01 09:10:15 compute-0 sudo[62718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfwqixwqhhcgixdbjztpxodprtooudon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580214.8715093-209-94817448755943/AnsiballZ_systemd.py'
Dec 01 09:10:15 compute-0 sudo[62718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:15 compute-0 python3.9[62720]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:10:15 compute-0 systemd[1]: Reloading.
Dec 01 09:10:15 compute-0 systemd-rc-local-generator[62751]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:10:15 compute-0 systemd-sysv-generator[62755]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:10:15 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Dec 01 09:10:15 compute-0 iptables.init[62761]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Dec 01 09:10:15 compute-0 iptables.init[62761]: iptables: Flushing firewall rules: [  OK  ]
Dec 01 09:10:15 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Dec 01 09:10:15 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Dec 01 09:10:16 compute-0 sudo[62718]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:16 compute-0 sudo[62955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jewsonomgvsxeohjedtsvajkprfqdnwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580216.1458132-209-63656167334306/AnsiballZ_systemd.py'
Dec 01 09:10:16 compute-0 sudo[62955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:16 compute-0 python3.9[62957]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:10:16 compute-0 sudo[62955]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:17 compute-0 sudo[63109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itpyyzmdgnpndzoidsgqokglqatvxdgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580216.9461644-225-56942632696856/AnsiballZ_systemd.py'
Dec 01 09:10:17 compute-0 sudo[63109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:17 compute-0 python3.9[63111]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:10:17 compute-0 systemd[1]: Reloading.
Dec 01 09:10:17 compute-0 systemd-rc-local-generator[63138]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:10:17 compute-0 systemd-sysv-generator[63144]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:10:17 compute-0 systemd[1]: Starting Netfilter Tables...
Dec 01 09:10:17 compute-0 systemd[1]: Finished Netfilter Tables.
Dec 01 09:10:17 compute-0 sudo[63109]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:18 compute-0 sudo[63301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwbyejgujrackquukpcfpqsoanqqysnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580217.941182-233-118062329389772/AnsiballZ_command.py'
Dec 01 09:10:18 compute-0 sudo[63301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:18 compute-0 python3.9[63303]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:10:18 compute-0 sudo[63301]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:19 compute-0 sudo[63454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drwvzxgutdxtjboaumtilkqyoceadlwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580218.9172382-247-109808846649777/AnsiballZ_stat.py'
Dec 01 09:10:19 compute-0 sudo[63454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:19 compute-0 python3.9[63456]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:10:19 compute-0 sudo[63454]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:19 compute-0 sudo[63579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bubbodudzccfzeqnhrwsebaxpqjqdfjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580218.9172382-247-109808846649777/AnsiballZ_copy.py'
Dec 01 09:10:19 compute-0 sudo[63579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:19 compute-0 python3.9[63581]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580218.9172382-247-109808846649777/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:10:20 compute-0 sudo[63579]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:20 compute-0 sudo[63732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpwahhyqjaqooouayxyflcegklqxtjyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580220.1721067-262-173351146622773/AnsiballZ_systemd.py'
Dec 01 09:10:20 compute-0 sudo[63732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:20 compute-0 python3.9[63734]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 09:10:20 compute-0 systemd[1]: Reloading OpenSSH server daemon...
Dec 01 09:10:20 compute-0 sshd[1008]: Received SIGHUP; restarting.
Dec 01 09:10:20 compute-0 sshd[1008]: Server listening on 0.0.0.0 port 22.
Dec 01 09:10:20 compute-0 sshd[1008]: Server listening on :: port 22.
Dec 01 09:10:20 compute-0 systemd[1]: Reloaded OpenSSH server daemon.
Dec 01 09:10:20 compute-0 sudo[63732]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:21 compute-0 sudo[63888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pclmqtbvzxycfdkqptelzwimpdufkjlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580221.0028956-270-187558873989579/AnsiballZ_file.py'
Dec 01 09:10:21 compute-0 sudo[63888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:21 compute-0 python3.9[63890]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:10:21 compute-0 sudo[63888]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:21 compute-0 sudo[64040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvjrtuijwqmpzbmfejaynmxtezdvjqfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580221.5954757-278-265994370727437/AnsiballZ_stat.py'
Dec 01 09:10:21 compute-0 sudo[64040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:22 compute-0 python3.9[64042]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:10:22 compute-0 sudo[64040]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:22 compute-0 sudo[64163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgjaskhawovccoqarxqmtujxfaemysyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580221.5954757-278-265994370727437/AnsiballZ_copy.py'
Dec 01 09:10:22 compute-0 sudo[64163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:22 compute-0 python3.9[64165]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580221.5954757-278-265994370727437/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:10:22 compute-0 sudo[64163]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:23 compute-0 sudo[64315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hopnagggdnxqkpygbweexojdnuzrubey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580222.8568664-296-228470389399002/AnsiballZ_timezone.py'
Dec 01 09:10:23 compute-0 sudo[64315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:23 compute-0 python3.9[64317]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 01 09:10:23 compute-0 systemd[1]: Starting Time & Date Service...
Dec 01 09:10:23 compute-0 systemd[1]: Started Time & Date Service.
Dec 01 09:10:23 compute-0 sudo[64315]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:24 compute-0 sudo[64471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqylijmeuzxwgqnhuaafhtvcliccgflv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580223.856793-305-199838466395332/AnsiballZ_file.py'
Dec 01 09:10:24 compute-0 sudo[64471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:24 compute-0 python3.9[64473]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:10:24 compute-0 sudo[64471]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:24 compute-0 sudo[64623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqbqxqhegsvxwnlnzvyzvyiknfueivmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580224.6005318-313-169543447719693/AnsiballZ_stat.py'
Dec 01 09:10:24 compute-0 sudo[64623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:25 compute-0 python3.9[64625]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:10:25 compute-0 sudo[64623]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:25 compute-0 sudo[64746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edibuuzgfqhjnkamulqhzliprplkclyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580224.6005318-313-169543447719693/AnsiballZ_copy.py'
Dec 01 09:10:25 compute-0 sudo[64746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:25 compute-0 python3.9[64748]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580224.6005318-313-169543447719693/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:10:25 compute-0 sudo[64746]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:26 compute-0 sudo[64898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwuksomfjhvwvvncsevexwwdmgqqupny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580225.8162234-328-265101831653663/AnsiballZ_stat.py'
Dec 01 09:10:26 compute-0 sudo[64898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:26 compute-0 python3.9[64900]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:10:26 compute-0 sudo[64898]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:26 compute-0 sudo[65021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cepsohejwsiizzmuriivtldrxhjgqqyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580225.8162234-328-265101831653663/AnsiballZ_copy.py'
Dec 01 09:10:26 compute-0 sudo[65021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:26 compute-0 python3.9[65023]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580225.8162234-328-265101831653663/.source.yaml _original_basename=.12qrcw_2 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:10:26 compute-0 sudo[65021]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:27 compute-0 sudo[65173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snulwoewaswddhzieujdzntgmtfirwoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580227.00589-343-144813562374855/AnsiballZ_stat.py'
Dec 01 09:10:27 compute-0 sudo[65173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:27 compute-0 python3.9[65175]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:10:27 compute-0 sudo[65173]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:27 compute-0 sudo[65296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dljvekcbrhngshitmhvtsrslsyzzmznu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580227.00589-343-144813562374855/AnsiballZ_copy.py'
Dec 01 09:10:27 compute-0 sudo[65296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:27 compute-0 python3.9[65298]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580227.00589-343-144813562374855/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:10:27 compute-0 sudo[65296]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:28 compute-0 sudo[65448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpelkwjjjdlrfhzezagncjxwgnkgaifi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580228.4324503-358-125636054192418/AnsiballZ_command.py'
Dec 01 09:10:28 compute-0 sudo[65448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:28 compute-0 python3.9[65450]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:10:28 compute-0 sudo[65448]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:29 compute-0 sudo[65601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbmcxzalglukzoboveywwdhgdapwkzlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580229.0279973-366-248081321825637/AnsiballZ_command.py'
Dec 01 09:10:29 compute-0 sudo[65601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:29 compute-0 python3.9[65603]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:10:29 compute-0 sudo[65601]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:30 compute-0 sudo[65754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfpcfndbvpfuvkmimziliuxinhulrrhb ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764580229.7096148-374-158905687613198/AnsiballZ_edpm_nftables_from_files.py'
Dec 01 09:10:30 compute-0 sudo[65754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:30 compute-0 python3[65756]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 01 09:10:30 compute-0 sudo[65754]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:30 compute-0 sudo[65906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlmsrnwujjhhblnajlftxkqboelxctjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580230.560174-382-238634775695896/AnsiballZ_stat.py'
Dec 01 09:10:30 compute-0 sudo[65906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:31 compute-0 python3.9[65908]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:10:31 compute-0 sudo[65906]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:31 compute-0 sudo[66029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuvntqfhnooramdiysrxlfinbyuqpguq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580230.560174-382-238634775695896/AnsiballZ_copy.py'
Dec 01 09:10:31 compute-0 sudo[66029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:31 compute-0 python3.9[66031]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580230.560174-382-238634775695896/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:10:31 compute-0 sudo[66029]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:32 compute-0 sudo[66181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugfzwaxsgbmngqdolqdbperqqhurdhtu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580231.7545607-397-158485207755866/AnsiballZ_stat.py'
Dec 01 09:10:32 compute-0 sudo[66181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:32 compute-0 python3.9[66183]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:10:32 compute-0 sudo[66181]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:32 compute-0 sudo[66304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ouyytpkihmwndsblpjtqknlfmhukhqip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580231.7545607-397-158485207755866/AnsiballZ_copy.py'
Dec 01 09:10:32 compute-0 sudo[66304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:32 compute-0 python3.9[66306]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580231.7545607-397-158485207755866/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:10:32 compute-0 sudo[66304]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:33 compute-0 sudo[66456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytiwslofgvsypwthoyvumchbdmpvmlag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580232.8348715-412-119528364752335/AnsiballZ_stat.py'
Dec 01 09:10:33 compute-0 sudo[66456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:33 compute-0 python3.9[66458]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:10:33 compute-0 sudo[66456]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:33 compute-0 sudo[66579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efuqxaeqdsfbofpvmuyzgpjnyjghmqwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580232.8348715-412-119528364752335/AnsiballZ_copy.py'
Dec 01 09:10:33 compute-0 sudo[66579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:33 compute-0 python3.9[66581]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580232.8348715-412-119528364752335/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:10:33 compute-0 sudo[66579]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:34 compute-0 sudo[66731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aomlggafuyoazsaohvhbvaktbhyqzmrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580233.9697518-427-182805217374537/AnsiballZ_stat.py'
Dec 01 09:10:34 compute-0 sudo[66731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:34 compute-0 python3.9[66733]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:10:34 compute-0 sudo[66731]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:34 compute-0 sudo[66854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppkfldukydpwdvnhqvttolrpjrrowybq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580233.9697518-427-182805217374537/AnsiballZ_copy.py'
Dec 01 09:10:34 compute-0 sudo[66854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:35 compute-0 python3.9[66856]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580233.9697518-427-182805217374537/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:10:35 compute-0 sudo[66854]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:35 compute-0 sudo[67006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhumvezmeqvpmqkkjywirgvytdfziqgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580235.2278483-442-242492941712870/AnsiballZ_stat.py'
Dec 01 09:10:35 compute-0 sudo[67006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:35 compute-0 python3.9[67008]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:10:35 compute-0 sudo[67006]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:36 compute-0 sudo[67129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzjkdjapdvanxrubpqienhbeveaagptj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580235.2278483-442-242492941712870/AnsiballZ_copy.py'
Dec 01 09:10:36 compute-0 sudo[67129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:36 compute-0 python3.9[67131]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580235.2278483-442-242492941712870/.source.nft follow=False _original_basename=ruleset.j2 checksum=693377dc03e5b6b24713cb537b18b88774724e35 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:10:36 compute-0 sudo[67129]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:36 compute-0 sudo[67281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jumjgvmwtdkizxldogxycsknamkgwryv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580236.3682158-457-95738976003112/AnsiballZ_file.py'
Dec 01 09:10:36 compute-0 sudo[67281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:36 compute-0 python3.9[67283]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:10:36 compute-0 sudo[67281]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:37 compute-0 sudo[67433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzhbslgvcebhbamtpwlevzcsgnmsszsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580237.0054214-465-145319153613364/AnsiballZ_command.py'
Dec 01 09:10:37 compute-0 sudo[67433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:37 compute-0 python3.9[67435]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:10:37 compute-0 sudo[67433]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:38 compute-0 sudo[67592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxsxhodenbummonenieccdzknloxkrtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580237.7832983-473-262479925163595/AnsiballZ_blockinfile.py'
Dec 01 09:10:38 compute-0 sudo[67592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:38 compute-0 python3.9[67594]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:10:38 compute-0 sudo[67592]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:39 compute-0 sudo[67745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odljmikawqthtcqvqwkvtndjashnwskz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580238.7929072-482-252296543930359/AnsiballZ_file.py'
Dec 01 09:10:39 compute-0 sudo[67745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:39 compute-0 python3.9[67747]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:10:39 compute-0 sudo[67745]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:39 compute-0 sudo[67897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qeyohcezjuamrzogktyetbrbbqwronbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580239.5867207-482-72550994500219/AnsiballZ_file.py'
Dec 01 09:10:39 compute-0 sudo[67897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:40 compute-0 python3.9[67899]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:10:40 compute-0 sudo[67897]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:40 compute-0 sudo[68049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqyclmswvflukkobwbyqwxgrymuravfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580240.1982949-497-189280991236769/AnsiballZ_mount.py'
Dec 01 09:10:40 compute-0 sudo[68049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:40 compute-0 python3.9[68051]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 01 09:10:40 compute-0 sudo[68049]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:41 compute-0 sudo[68202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxcdraablpezshqyexzwhpfqdieqkfen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580241.09104-497-171036252869917/AnsiballZ_mount.py'
Dec 01 09:10:41 compute-0 sudo[68202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:41 compute-0 python3.9[68204]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 01 09:10:41 compute-0 sudo[68202]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:42 compute-0 sshd-session[59000]: Connection closed by 192.168.122.30 port 34672
Dec 01 09:10:42 compute-0 sshd-session[58997]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:10:42 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Dec 01 09:10:42 compute-0 systemd[1]: session-14.scope: Consumed 31.997s CPU time.
Dec 01 09:10:42 compute-0 systemd-logind[788]: Session 14 logged out. Waiting for processes to exit.
Dec 01 09:10:42 compute-0 systemd-logind[788]: Removed session 14.
Dec 01 09:10:47 compute-0 sshd-session[68230]: Accepted publickey for zuul from 192.168.122.30 port 34410 ssh2: ECDSA SHA256:qeYLzcMt7u+aIXWvxTim9ZQrhR80x0VMZgLc05vjQmw
Dec 01 09:10:47 compute-0 systemd-logind[788]: New session 15 of user zuul.
Dec 01 09:10:47 compute-0 systemd[1]: Started Session 15 of User zuul.
Dec 01 09:10:47 compute-0 sshd-session[68230]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:10:48 compute-0 sudo[68383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxvnqchioqkfjwnzzbkmoxtbxpmyhoyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580247.9966815-16-99705650507401/AnsiballZ_tempfile.py'
Dec 01 09:10:48 compute-0 sudo[68383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:48 compute-0 python3.9[68385]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec 01 09:10:48 compute-0 sudo[68383]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:49 compute-0 sudo[68535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jagdgtbgsdlgsthjeubtcqsuncnfdjcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580248.792347-28-234047439594073/AnsiballZ_stat.py'
Dec 01 09:10:49 compute-0 sudo[68535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:49 compute-0 python3.9[68537]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:10:49 compute-0 sudo[68535]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:50 compute-0 sudo[68687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alwynfgkjmapqhoqyuljzouxbdcapkpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580249.605424-38-40329378254699/AnsiballZ_setup.py'
Dec 01 09:10:50 compute-0 sudo[68687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:50 compute-0 python3.9[68689]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:10:50 compute-0 sudo[68687]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:51 compute-0 sudo[68839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehmccmmziqknihappmjfhdhszgoykfij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580250.6521487-47-193843570507917/AnsiballZ_blockinfile.py'
Dec 01 09:10:51 compute-0 sudo[68839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:51 compute-0 python3.9[68841]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDRTxmAPcz2eFUCrQOAknLp4ibCvALuiJ7iA+ICPT8Mpd8XYcXDdZBZjlSgWd0U+d6qvFNYaJ4Kq/cNnxeSVMCkpQCGri3TTRfaS9L5COiCf0cmBNheHZSQL0uZLjKzjeaIyGWH6HdOA7KUsCK2YT/Iyf0OJzrBs5vhWuzbSXsCjsHTSzR+XxRX3C/ImHAtccLwxysUhm6H4CGIPn0bY/YGgoRkJUvouHT/4kSxhQrtFAKJOWlJ01d3tdISKrGa+SiKU6zq4yCgT5yeSsMSRyP+L06UuH7Htv2BSPXmTFLy8alJrAKLo19SllAr6m5ZP3OWy9eRDvp+oa4ZA3J9JX+isLwhjDkF1Q+aes+99JQ6E7W5hL8qvDAHCwaKgIo1IRMHJEVvZNsKqn+ME9EBDD1WyTNzik/qEOj2Cr9TXxmps8zD0VcngBAhdAv39R6EAPnVfRf1Goyagp6gPsCOeulh58jgrvAZ7L89u1J5yZY4C2Cu9js9UJwp46pdgU5qDDM=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILCzsFh+ZK0hqueDU2gWvb+j6m7hD/RYc8+thzHnJPmj
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBN8lUi9ZvyyCZ7KdPvA7WBYtjDR8VhQzZuiukEvvpoRp0UJKIzVf11cXzP5sRkLnexUeWiXTv+jZK8hoAN9Othc=
                                             create=True mode=0644 path=/tmp/ansible.p_slgdzj state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:10:51 compute-0 sudo[68839]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:51 compute-0 sudo[68991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbbgfwotjhbmmtdxuoltrirzlbdaodhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580251.4161696-55-133389163868952/AnsiballZ_command.py'
Dec 01 09:10:51 compute-0 sudo[68991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:52 compute-0 python3.9[68993]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.p_slgdzj' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:10:52 compute-0 sudo[68991]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:52 compute-0 sudo[69145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgmtkfextvhlhktkovxpdijifhatxrrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580252.2141447-63-70646362248189/AnsiballZ_file.py'
Dec 01 09:10:52 compute-0 sudo[69145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:52 compute-0 python3.9[69147]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.p_slgdzj state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:10:52 compute-0 sudo[69145]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:53 compute-0 sshd-session[68233]: Connection closed by 192.168.122.30 port 34410
Dec 01 09:10:53 compute-0 sshd-session[68230]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:10:53 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Dec 01 09:10:53 compute-0 systemd[1]: session-15.scope: Consumed 3.015s CPU time.
Dec 01 09:10:53 compute-0 systemd-logind[788]: Session 15 logged out. Waiting for processes to exit.
Dec 01 09:10:53 compute-0 systemd-logind[788]: Removed session 15.
Dec 01 09:10:53 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 01 09:11:00 compute-0 sshd-session[69174]: Accepted publickey for zuul from 192.168.122.30 port 60864 ssh2: ECDSA SHA256:qeYLzcMt7u+aIXWvxTim9ZQrhR80x0VMZgLc05vjQmw
Dec 01 09:11:00 compute-0 systemd-logind[788]: New session 16 of user zuul.
Dec 01 09:11:00 compute-0 systemd[1]: Started Session 16 of User zuul.
Dec 01 09:11:00 compute-0 sshd-session[69174]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:11:01 compute-0 python3.9[69327]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:11:01 compute-0 anacron[30901]: Job `cron.daily' started
Dec 01 09:11:01 compute-0 anacron[30901]: Job `cron.daily' terminated
Dec 01 09:11:02 compute-0 sudo[69483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-macqxkozvfnauyvdjljzfrdpogeisktw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580261.6210706-32-79269693343286/AnsiballZ_systemd.py'
Dec 01 09:11:02 compute-0 sudo[69483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:02 compute-0 python3.9[69485]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 01 09:11:02 compute-0 sudo[69483]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:03 compute-0 sudo[69637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkdiaojjhvzxrhbikouqyijfdzmhaxnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580262.8333652-40-232840609284841/AnsiballZ_systemd.py'
Dec 01 09:11:03 compute-0 sudo[69637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:03 compute-0 python3.9[69639]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 09:11:03 compute-0 sudo[69637]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:04 compute-0 sudo[69790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovpwcviohaukomobzdxbjhskqivaacxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580263.60874-49-25840234086191/AnsiballZ_command.py'
Dec 01 09:11:04 compute-0 sudo[69790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:04 compute-0 python3.9[69792]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:11:04 compute-0 sudo[69790]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:04 compute-0 sudo[69943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pspyngytqzwvclxaaxakcoqassqrpxke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580264.4602597-57-189269769898121/AnsiballZ_stat.py'
Dec 01 09:11:04 compute-0 sudo[69943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:05 compute-0 python3.9[69945]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:11:05 compute-0 sudo[69943]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:05 compute-0 sudo[70097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oklbxlllpifyeoqsobidwyewapobogfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580265.312-65-262195593046581/AnsiballZ_command.py'
Dec 01 09:11:05 compute-0 sudo[70097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:05 compute-0 python3.9[70099]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:11:05 compute-0 sudo[70097]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:06 compute-0 sudo[70252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kffaouleageuocxsynairscgzfhtyhgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580265.9027512-73-13808206834602/AnsiballZ_file.py'
Dec 01 09:11:06 compute-0 sudo[70252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:06 compute-0 python3.9[70254]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:11:06 compute-0 sudo[70252]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:06 compute-0 sshd-session[69177]: Connection closed by 192.168.122.30 port 60864
Dec 01 09:11:06 compute-0 sshd-session[69174]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:11:06 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Dec 01 09:11:06 compute-0 systemd[1]: session-16.scope: Consumed 4.105s CPU time.
Dec 01 09:11:06 compute-0 systemd-logind[788]: Session 16 logged out. Waiting for processes to exit.
Dec 01 09:11:06 compute-0 systemd-logind[788]: Removed session 16.
Dec 01 09:11:12 compute-0 sshd-session[70279]: Accepted publickey for zuul from 192.168.122.30 port 49698 ssh2: ECDSA SHA256:qeYLzcMt7u+aIXWvxTim9ZQrhR80x0VMZgLc05vjQmw
Dec 01 09:11:12 compute-0 systemd-logind[788]: New session 17 of user zuul.
Dec 01 09:11:12 compute-0 systemd[1]: Started Session 17 of User zuul.
Dec 01 09:11:12 compute-0 sshd-session[70279]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:11:13 compute-0 python3.9[70432]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:11:14 compute-0 sudo[70586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpdhulrhigohennkglohqkayfvflgtql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580273.7296486-34-167843566780343/AnsiballZ_setup.py'
Dec 01 09:11:14 compute-0 sudo[70586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:14 compute-0 python3.9[70588]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 09:11:14 compute-0 sudo[70586]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:14 compute-0 sudo[70670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dznzizyrpvadlvyusatzuxmpglqqlhii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580273.7296486-34-167843566780343/AnsiballZ_dnf.py'
Dec 01 09:11:14 compute-0 sudo[70670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:15 compute-0 python3.9[70672]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 01 09:11:16 compute-0 sudo[70670]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:17 compute-0 python3.9[70823]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:11:18 compute-0 python3.9[70974]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 01 09:11:19 compute-0 python3.9[71124]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:11:19 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 01 09:11:19 compute-0 python3.9[71275]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:11:20 compute-0 sshd-session[70282]: Connection closed by 192.168.122.30 port 49698
Dec 01 09:11:20 compute-0 sshd-session[70279]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:11:20 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Dec 01 09:11:20 compute-0 systemd[1]: session-17.scope: Consumed 5.523s CPU time.
Dec 01 09:11:20 compute-0 systemd-logind[788]: Session 17 logged out. Waiting for processes to exit.
Dec 01 09:11:20 compute-0 systemd-logind[788]: Removed session 17.
Dec 01 09:11:27 compute-0 sshd-session[71300]: Accepted publickey for zuul from 38.102.83.177 port 55786 ssh2: RSA SHA256:wvbSSRHhGqscdLbt7uF108h9jRKS/yXgXkyPU84jbuE
Dec 01 09:11:27 compute-0 systemd-logind[788]: New session 18 of user zuul.
Dec 01 09:11:27 compute-0 systemd[1]: Started Session 18 of User zuul.
Dec 01 09:11:27 compute-0 sshd-session[71300]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:11:27 compute-0 sudo[71376]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lppnolrarorbucdyecztpikjsalxuxll ; /usr/bin/python3'
Dec 01 09:11:27 compute-0 sudo[71376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:28 compute-0 useradd[71380]: new group: name=ceph-admin, GID=42478
Dec 01 09:11:28 compute-0 useradd[71380]: new user: name=ceph-admin, UID=42477, GID=42478, home=/home/ceph-admin, shell=/bin/bash, from=none
Dec 01 09:11:28 compute-0 sudo[71376]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:28 compute-0 sudo[71462]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knuazrrgpwqpsjmmtqdavmhctymmoyql ; /usr/bin/python3'
Dec 01 09:11:28 compute-0 sudo[71462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:28 compute-0 sudo[71462]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:29 compute-0 sudo[71535]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdprjtjevxjjrubyqyiswhdccskjaoqm ; /usr/bin/python3'
Dec 01 09:11:29 compute-0 sudo[71535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:29 compute-0 sudo[71535]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:29 compute-0 sudo[71585]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgbrrbsiyjhygfyqccvznfrkxyxwemwh ; /usr/bin/python3'
Dec 01 09:11:29 compute-0 sudo[71585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:29 compute-0 sudo[71585]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:29 compute-0 sudo[71611]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klazrlhjsaqadcwajbgrtdepntkirhrn ; /usr/bin/python3'
Dec 01 09:11:29 compute-0 sudo[71611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:30 compute-0 sudo[71611]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:30 compute-0 sudo[71637]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aynhfjxbafvamrdirjwbqqtuyinuvbwx ; /usr/bin/python3'
Dec 01 09:11:30 compute-0 sudo[71637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:30 compute-0 sudo[71637]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:30 compute-0 sudo[71663]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfiwedtyosiphvzoeknndqucwokcypus ; /usr/bin/python3'
Dec 01 09:11:30 compute-0 sudo[71663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:30 compute-0 sudo[71663]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:31 compute-0 sudo[71741]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fytgnvvecappoxktrvhvqcrdpyelqhzg ; /usr/bin/python3'
Dec 01 09:11:31 compute-0 sudo[71741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:31 compute-0 sudo[71741]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:31 compute-0 sudo[71814]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-halefjlgohfwdykcanhfkillhvnxnuez ; /usr/bin/python3'
Dec 01 09:11:31 compute-0 sudo[71814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:31 compute-0 sudo[71814]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:31 compute-0 sudo[71916]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thuninilrmlshsbqrqyqjlunxrjdjbbo ; /usr/bin/python3'
Dec 01 09:11:31 compute-0 sudo[71916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:32 compute-0 sudo[71916]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:32 compute-0 sudo[71989]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oghmnpghvenqkpawdudmujpnojxhftgx ; /usr/bin/python3'
Dec 01 09:11:32 compute-0 sudo[71989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:32 compute-0 sudo[71989]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:32 compute-0 sudo[72039]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkeutltxymqzwfnxbtalnpqkylwxolyi ; /usr/bin/python3'
Dec 01 09:11:32 compute-0 sudo[72039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:33 compute-0 python3[72041]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:11:33 compute-0 sudo[72039]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:34 compute-0 sudo[72134]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aazcgwgnriqxbrycnaqtrsmbtobycalr ; /usr/bin/python3'
Dec 01 09:11:34 compute-0 sudo[72134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:34 compute-0 python3[72136]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 01 09:11:35 compute-0 sudo[72134]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:36 compute-0 sudo[72161]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hczhlkfbcxiaeykmpfmwpospkstghiqv ; /usr/bin/python3'
Dec 01 09:11:36 compute-0 sudo[72161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:36 compute-0 python3[72163]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 01 09:11:36 compute-0 sudo[72161]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:36 compute-0 sudo[72187]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-duwhenhoquyvkqacapugtvcaoeyteraf ; /usr/bin/python3'
Dec 01 09:11:36 compute-0 sudo[72187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:36 compute-0 python3[72189]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=20G
                                          losetup /dev/loop3 /var/lib/ceph-osd-0.img
                                          lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:11:36 compute-0 kernel: loop: module loaded
Dec 01 09:11:36 compute-0 kernel: loop3: detected capacity change from 0 to 41943040
Dec 01 09:11:36 compute-0 sudo[72187]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:36 compute-0 sudo[72223]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnabpykxiwwwmpbrsviacvywmrljkfbf ; /usr/bin/python3'
Dec 01 09:11:36 compute-0 sudo[72223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:36 compute-0 python3[72225]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3
                                          vgcreate ceph_vg0 /dev/loop3
                                          lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0
                                          lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:11:36 compute-0 lvm[72228]: PV /dev/loop3 not used.
Dec 01 09:11:37 compute-0 lvm[72237]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 01 09:11:37 compute-0 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Dec 01 09:11:37 compute-0 sudo[72223]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:37 compute-0 lvm[72239]:   1 logical volume(s) in volume group "ceph_vg0" now active
Dec 01 09:11:37 compute-0 systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
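The entries above capture one iteration of the per-OSD provisioning loop: a sparse 20G backing file, a loop device, then a PV/VG/LV stack on top of it. The log shows three iterations (OSD 0→/dev/loop3, 1→/dev/loop4, 2→/dev/loop5). As a sketch under those assumptions, the function below prints the command sequence for a given OSD index instead of executing it, since losetup and the LVM tools require root:

```shell
# Sketch of the per-OSD device provisioning seen in the journal.
# Assumptions taken from the log: backing files live in /var/lib,
# loop devices start at /dev/loop3, and VG/LV names carry the index.
# Prints the commands rather than running them (they need root).
make_osd_cmds() {
  local idx="$1"
  local loopdev="/dev/loop$((idx + 3))"
  cat <<EOF
dd if=/dev/zero of=/var/lib/ceph-osd-${idx}.img bs=1 count=0 seek=20G
losetup ${loopdev} /var/lib/ceph-osd-${idx}.img
pvcreate ${loopdev}
vgcreate ceph_vg${idx} ${loopdev}
lvcreate -n ceph_lv${idx} -l +100%FREE ceph_vg${idx}
EOF
}

make_osd_cmds 0
```

Note that `dd ... bs=1 count=0 seek=20G` writes no data: it only truncates the file to 20 GiB, which is why the kernel reports a capacity change to 41943040 512-byte sectors as soon as losetup attaches it.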
Dec 01 09:11:37 compute-0 sudo[72315]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwagkaachkhympqdmuefildhygfitqkg ; /usr/bin/python3'
Dec 01 09:11:37 compute-0 sudo[72315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:37 compute-0 python3[72317]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 09:11:37 compute-0 sudo[72315]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:37 compute-0 sudo[72388]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqqyqliofezbepnpiuirlmdwhjxzpfmf ; /usr/bin/python3'
Dec 01 09:11:37 compute-0 sudo[72388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:37 compute-0 python3[72390]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764580297.2150416-36106-155201303425810/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:11:37 compute-0 sudo[72388]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:38 compute-0 sudo[72438]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ereelekopdlbuiwxfimzjofgylqjfctv ; /usr/bin/python3'
Dec 01 09:11:38 compute-0 sudo[72438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:38 compute-0 python3[72440]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:11:38 compute-0 systemd[1]: Reloading.
Dec 01 09:11:38 compute-0 systemd-rc-local-generator[72470]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:11:38 compute-0 systemd-sysv-generator[72473]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:11:38 compute-0 systemd[1]: Starting Ceph OSD losetup...
Dec 01 09:11:38 compute-0 bash[72480]: /dev/loop3: [64513]:4194937 (/var/lib/ceph-osd-0.img)
Dec 01 09:11:38 compute-0 systemd[1]: Finished Ceph OSD losetup.
Dec 01 09:11:38 compute-0 lvm[72481]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 01 09:11:38 compute-0 lvm[72481]: VG ceph_vg0 finished
Dec 01 09:11:38 compute-0 sudo[72438]: pam_unix(sudo:session): session closed for user root
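The `ceph-osd-losetup-N.service` units deployed above are rendered from `ceph-osd-losetup.service.j2`; the template itself does not appear in this log, so the following is a hypothetical reconstruction consistent with what the journal records (a oneshot bash step whose `losetup` listing output appears as the `bash[72480]: /dev/loop3: ...` line, ensuring the loop device is re-attached across reboots):

```
# Hypothetical sketch of /etc/systemd/system/ceph-osd-losetup-0.service.
# The real unit comes from ceph-osd-losetup.service.j2, which this log
# does not contain; paths and names below are taken from the journal.
[Unit]
Description=Ceph OSD losetup

[Service]
Type=oneshot
# Re-attach the backing file if the loop device is not set up, then
# list it (producing the "/dev/loop3: [...] (/var/lib/ceph-osd-0.img)"
# output seen in the log).
ExecStart=/bin/bash -c '/sbin/losetup /dev/loop3 || /sbin/losetup /dev/loop3 /var/lib/ceph-osd-0.img; /sbin/losetup /dev/loop3'
RemainAfterExit=true

[Install]
WantedBy=multi-user.target
```

Because the unit is `Type=oneshot` with `RemainAfterExit=true`, systemd reports "Finished Ceph OSD losetup." immediately after the command exits, matching the journal sequence above.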
Dec 01 09:11:38 compute-0 sudo[72505]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekhqabekhgtbtqmazdzhwwzxmsgejjin ; /usr/bin/python3'
Dec 01 09:11:38 compute-0 sudo[72505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:39 compute-0 python3[72507]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 01 09:11:40 compute-0 sudo[72505]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:40 compute-0 sudo[72532]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxvlowefuwlujemogujmhqjiczwkpttt ; /usr/bin/python3'
Dec 01 09:11:40 compute-0 sudo[72532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:40 compute-0 python3[72534]: ansible-ansible.builtin.stat Invoked with path=/dev/loop4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 01 09:11:40 compute-0 sudo[72532]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:40 compute-0 sudo[72558]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ooasdqmoqaytekoynvlmckmppardeavg ; /usr/bin/python3'
Dec 01 09:11:40 compute-0 sudo[72558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:41 compute-0 python3[72560]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-1.img bs=1 count=0 seek=20G
                                          losetup /dev/loop4 /var/lib/ceph-osd-1.img
                                          lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:11:41 compute-0 kernel: loop4: detected capacity change from 0 to 41943040
Dec 01 09:11:41 compute-0 sudo[72558]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:41 compute-0 sudo[72590]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-numxfwpeyivcojirsjaiwlpqkaekqjmm ; /usr/bin/python3'
Dec 01 09:11:41 compute-0 sudo[72590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:41 compute-0 python3[72592]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop4
                                          vgcreate ceph_vg1 /dev/loop4
                                          lvcreate -n ceph_lv1 -l +100%FREE ceph_vg1
                                          lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:11:41 compute-0 lvm[72595]: PV /dev/loop4 not used.
Dec 01 09:11:41 compute-0 lvm[72605]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 01 09:11:41 compute-0 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg1.
Dec 01 09:11:41 compute-0 sudo[72590]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:41 compute-0 lvm[72607]:   1 logical volume(s) in volume group "ceph_vg1" now active
Dec 01 09:11:41 compute-0 systemd[1]: lvm-activate-ceph_vg1.service: Deactivated successfully.
Dec 01 09:11:41 compute-0 sudo[72683]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uunrijjubqkpqqlqjxwduyzttpfmkcot ; /usr/bin/python3'
Dec 01 09:11:41 compute-0 sudo[72683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:42 compute-0 python3[72685]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-1.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 09:11:42 compute-0 sudo[72683]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:42 compute-0 sudo[72756]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udpqfdrlhroqzfpufmxfldbxsztffxgr ; /usr/bin/python3'
Dec 01 09:11:42 compute-0 sudo[72756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:42 compute-0 python3[72758]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764580301.7458215-36133-278484422018864/source dest=/etc/systemd/system/ceph-osd-losetup-1.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=19612168ea279db4171b94ee1f8625de1ec44b58 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:11:42 compute-0 sudo[72756]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:42 compute-0 sudo[72806]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oecmkzldgdeqrdqoilavjqeofhqqerep ; /usr/bin/python3'
Dec 01 09:11:42 compute-0 sudo[72806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:42 compute-0 python3[72808]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-1.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:11:42 compute-0 systemd[1]: Reloading.
Dec 01 09:11:42 compute-0 systemd-sysv-generator[72838]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:11:42 compute-0 systemd-rc-local-generator[72835]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:11:43 compute-0 systemd[1]: Starting Ceph OSD losetup...
Dec 01 09:11:43 compute-0 bash[72848]: /dev/loop4: [64513]:4327981 (/var/lib/ceph-osd-1.img)
Dec 01 09:11:43 compute-0 systemd[1]: Finished Ceph OSD losetup.
Dec 01 09:11:43 compute-0 lvm[72849]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 01 09:11:43 compute-0 lvm[72849]: VG ceph_vg1 finished
Dec 01 09:11:43 compute-0 sudo[72806]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:43 compute-0 sudo[72873]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkralzpqbafngbexlgecgvnwoparezgg ; /usr/bin/python3'
Dec 01 09:11:43 compute-0 sudo[72873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:43 compute-0 python3[72875]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 01 09:11:44 compute-0 sudo[72873]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:44 compute-0 sudo[72900]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxdlxrtlmdxbxkvcyhjrtknjhsynaaxd ; /usr/bin/python3'
Dec 01 09:11:44 compute-0 sudo[72900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:45 compute-0 python3[72902]: ansible-ansible.builtin.stat Invoked with path=/dev/loop5 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 01 09:11:45 compute-0 sudo[72900]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:45 compute-0 sudo[72926]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-erezsmqnopayezayhhcixljnknfohwtt ; /usr/bin/python3'
Dec 01 09:11:45 compute-0 sudo[72926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:45 compute-0 python3[72928]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-2.img bs=1 count=0 seek=20G
                                          losetup /dev/loop5 /var/lib/ceph-osd-2.img
                                          lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:11:45 compute-0 kernel: loop5: detected capacity change from 0 to 41943040
Dec 01 09:11:45 compute-0 sudo[72926]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:45 compute-0 sudo[72958]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-japtpcjjpfzdtcdhpmvmtafcmxckgcxd ; /usr/bin/python3'
Dec 01 09:11:45 compute-0 sudo[72958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:45 compute-0 python3[72960]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop5
                                          vgcreate ceph_vg2 /dev/loop5
                                          lvcreate -n ceph_lv2 -l +100%FREE ceph_vg2
                                          lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:11:45 compute-0 lvm[72963]: PV /dev/loop5 not used.
Dec 01 09:11:45 compute-0 lvm[72965]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 01 09:11:45 compute-0 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg2.
Dec 01 09:11:45 compute-0 lvm[72975]:   1 logical volume(s) in volume group "ceph_vg2" now active
Dec 01 09:11:46 compute-0 systemd[1]: lvm-activate-ceph_vg2.service: Deactivated successfully.
Dec 01 09:11:46 compute-0 sudo[72958]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:46 compute-0 sudo[73051]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdhnciuvkkupsdatctskdkidbyvrofxa ; /usr/bin/python3'
Dec 01 09:11:46 compute-0 sudo[73051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:46 compute-0 python3[73053]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-2.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 09:11:46 compute-0 sudo[73051]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:46 compute-0 sudo[73124]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghnzecoskglprppgksmdjrtpletwrhml ; /usr/bin/python3'
Dec 01 09:11:46 compute-0 sudo[73124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:46 compute-0 python3[73126]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764580306.152131-36160-179906285034520/source dest=/etc/systemd/system/ceph-osd-losetup-2.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=4c5b1bc5693c499ffe2edaa97d63f5df7075d845 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:11:46 compute-0 sudo[73124]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:47 compute-0 sudo[73174]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zukhfyeuvzeoeyabzidsiekvniisidqx ; /usr/bin/python3'
Dec 01 09:11:47 compute-0 sudo[73174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:47 compute-0 python3[73176]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-2.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:11:47 compute-0 systemd[1]: Reloading.
Dec 01 09:11:47 compute-0 systemd-sysv-generator[73208]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:11:47 compute-0 systemd-rc-local-generator[73205]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:11:47 compute-0 systemd[1]: Starting Ceph OSD losetup...
Dec 01 09:11:47 compute-0 bash[73215]: /dev/loop5: [64513]:4327982 (/var/lib/ceph-osd-2.img)
Dec 01 09:11:47 compute-0 systemd[1]: Finished Ceph OSD losetup.
Dec 01 09:11:47 compute-0 lvm[73216]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 01 09:11:47 compute-0 lvm[73216]: VG ceph_vg2 finished
Dec 01 09:11:47 compute-0 sudo[73174]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:49 compute-0 chronyd[58516]: Selected source 162.159.200.1 (pool.ntp.org)
Dec 01 09:11:49 compute-0 python3[73240]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:11:51 compute-0 sudo[73331]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwnbzyjvdsogvidkuluphikehvrajglp ; /usr/bin/python3'
Dec 01 09:11:51 compute-0 sudo[73331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:51 compute-0 python3[73333]: ansible-ansible.legacy.dnf Invoked with name=['cephadm'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 01 09:11:53 compute-0 groupadd[73339]: group added to /etc/group: name=cephadm, GID=992
Dec 01 09:11:53 compute-0 groupadd[73339]: group added to /etc/gshadow: name=cephadm
Dec 01 09:11:53 compute-0 groupadd[73339]: new group: name=cephadm, GID=992
Dec 01 09:11:53 compute-0 useradd[73346]: new user: name=cephadm, UID=992, GID=992, home=/var/lib/cephadm, shell=/bin/bash, from=none
Dec 01 09:11:53 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 01 09:11:53 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 01 09:11:53 compute-0 sudo[73331]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:53 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 01 09:11:53 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 01 09:11:53 compute-0 systemd[1]: run-r370aa678a822416aa3de8c07224b703c.service: Deactivated successfully.
Dec 01 09:11:53 compute-0 sudo[73442]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozeqdqqrgiuvrzywaolyntgkfiavmrtv ; /usr/bin/python3'
Dec 01 09:11:53 compute-0 sudo[73442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:53 compute-0 python3[73444]: ansible-ansible.builtin.stat Invoked with path=/usr/sbin/cephadm follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 01 09:11:53 compute-0 sudo[73442]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:54 compute-0 sudo[73470]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgdjrzvbqewujxudjkeaoigjvakisywz ; /usr/bin/python3'
Dec 01 09:11:54 compute-0 sudo[73470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:54 compute-0 python3[73472]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/cephadm ls --no-detail _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:11:54 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 01 09:11:54 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 01 09:11:54 compute-0 sudo[73470]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:54 compute-0 sudo[73533]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpjrjazetllscjsqbwbaptmnqaunvoat ; /usr/bin/python3'
Dec 01 09:11:54 compute-0 sudo[73533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:55 compute-0 python3[73535]: ansible-ansible.builtin.file Invoked with path=/etc/ceph state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:11:55 compute-0 sudo[73533]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:55 compute-0 sudo[73559]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apmzgcmaipzmgxqwvtnxgfbtrdowdfbu ; /usr/bin/python3'
Dec 01 09:11:55 compute-0 sudo[73559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:55 compute-0 python3[73561]: ansible-ansible.builtin.file Invoked with path=/home/ceph-admin/specs owner=ceph-admin group=ceph-admin mode=0755 state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:11:55 compute-0 sudo[73559]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:55 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 01 09:11:55 compute-0 sudo[73637]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwpffnnfhnoxrnjttceukgfloxdhmstl ; /usr/bin/python3'
Dec 01 09:11:55 compute-0 sudo[73637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:55 compute-0 python3[73639]: ansible-ansible.legacy.stat Invoked with path=/home/ceph-admin/specs/ceph_spec.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 09:11:55 compute-0 sudo[73637]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:56 compute-0 sudo[73710]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdevsqbdiiztqtftjkngstboiurtudtc ; /usr/bin/python3'
Dec 01 09:11:56 compute-0 sudo[73710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:56 compute-0 python3[73712]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764580315.6917045-36307-268234461242790/source dest=/home/ceph-admin/specs/ceph_spec.yaml owner=ceph-admin group=ceph-admin mode=0644 _original_basename=ceph_spec.yml follow=False checksum=bb83c53af4ffd926a3f1eafe26a8be437df6401f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:11:56 compute-0 sudo[73710]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:56 compute-0 sudo[73812]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yeykyvlbjvedjjngteffxncwpgzvhllx ; /usr/bin/python3'
Dec 01 09:11:56 compute-0 sudo[73812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:57 compute-0 python3[73814]: ansible-ansible.legacy.stat Invoked with path=/home/ceph-admin/assimilate_ceph.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 09:11:57 compute-0 sudo[73812]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:57 compute-0 sudo[73885]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajjkxiyhijsdtmwftchkbhrvryxlpkdr ; /usr/bin/python3'
Dec 01 09:11:57 compute-0 sudo[73885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:57 compute-0 python3[73887]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764580316.8206906-36325-132693539641527/source dest=/home/ceph-admin/assimilate_ceph.conf owner=ceph-admin group=ceph-admin mode=0644 _original_basename=initial_ceph.conf follow=False checksum=41828f7c2442fdf376911255e33c12863fc3b1b3 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:11:57 compute-0 sudo[73885]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:57 compute-0 sudo[73935]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqzidxcazruygihukdoycrpxyyuznnls ; /usr/bin/python3'
Dec 01 09:11:57 compute-0 sudo[73935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:57 compute-0 python3[73937]: ansible-ansible.builtin.stat Invoked with path=/home/ceph-admin/.ssh/id_rsa follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 01 09:11:57 compute-0 sudo[73935]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:57 compute-0 sudo[73963]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idydebbvamzusqjzsezwqpwxsphifzay ; /usr/bin/python3'
Dec 01 09:11:57 compute-0 sudo[73963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:58 compute-0 python3[73965]: ansible-ansible.builtin.stat Invoked with path=/home/ceph-admin/.ssh/id_rsa.pub follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 01 09:11:58 compute-0 sudo[73963]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:58 compute-0 sudo[73991]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wacruwpiancjjvwrkbuotexuyolntdoy ; /usr/bin/python3'
Dec 01 09:11:58 compute-0 sudo[73991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:58 compute-0 python3[73993]: ansible-ansible.builtin.stat Invoked with path=/home/ceph-admin/assimilate_ceph.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 01 09:11:58 compute-0 sudo[73991]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:58 compute-0 sudo[74019]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfcmmfywphdwcgngfjdddvgujlfffavk ; /usr/bin/python3'
Dec 01 09:11:58 compute-0 sudo[74019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:11:58 compute-0 python3[74021]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/cephadm bootstrap --skip-firewalld --skip-prepare-host --ssh-private-key /home/ceph-admin/.ssh/id_rsa --ssh-public-key /home/ceph-admin/.ssh/id_rsa.pub --ssh-user ceph-admin --allow-fqdn-hostname --output-keyring /etc/ceph/ceph.client.admin.keyring --output-config /etc/ceph/ceph.conf --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config /home/ceph-admin/assimilate_ceph.conf \--single-host-defaults \--skip-monitoring-stack --skip-dashboard --mon-ip 192.168.122.100 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:11:58 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 01 09:11:58 compute-0 sshd-session[74036]: Accepted publickey for ceph-admin from 192.168.122.100 port 42934 ssh2: RSA SHA256:dOPvQuuIN3kx2Wedm5OOoD6LbPjc9bLwbWo2YXzVB2E
Dec 01 09:11:59 compute-0 systemd-logind[788]: New session 19 of user ceph-admin.
Dec 01 09:11:59 compute-0 systemd[1]: Created slice User Slice of UID 42477.
Dec 01 09:11:59 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42477...
Dec 01 09:11:59 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42477.
Dec 01 09:11:59 compute-0 systemd[1]: Starting User Manager for UID 42477...
Dec 01 09:11:59 compute-0 systemd[74040]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 01 09:11:59 compute-0 systemd[74040]: Queued start job for default target Main User Target.
Dec 01 09:11:59 compute-0 systemd[74040]: Created slice User Application Slice.
Dec 01 09:11:59 compute-0 systemd[74040]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 01 09:11:59 compute-0 systemd[74040]: Started Daily Cleanup of User's Temporary Directories.
Dec 01 09:11:59 compute-0 systemd[74040]: Reached target Paths.
Dec 01 09:11:59 compute-0 systemd[74040]: Reached target Timers.
Dec 01 09:11:59 compute-0 systemd[74040]: Starting D-Bus User Message Bus Socket...
Dec 01 09:11:59 compute-0 systemd[74040]: Starting Create User's Volatile Files and Directories...
Dec 01 09:11:59 compute-0 systemd[74040]: Listening on D-Bus User Message Bus Socket.
Dec 01 09:11:59 compute-0 systemd[74040]: Reached target Sockets.
Dec 01 09:11:59 compute-0 systemd[74040]: Finished Create User's Volatile Files and Directories.
Dec 01 09:11:59 compute-0 systemd[74040]: Reached target Basic System.
Dec 01 09:11:59 compute-0 systemd[74040]: Reached target Main User Target.
Dec 01 09:11:59 compute-0 systemd[74040]: Startup finished in 121ms.
Dec 01 09:11:59 compute-0 systemd[1]: Started User Manager for UID 42477.
Dec 01 09:11:59 compute-0 systemd[1]: Started Session 19 of User ceph-admin.
Dec 01 09:11:59 compute-0 sshd-session[74036]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 01 09:11:59 compute-0 sudo[74056]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/echo
Dec 01 09:11:59 compute-0 sudo[74056]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:11:59 compute-0 sudo[74056]: pam_unix(sudo:session): session closed for user root
Dec 01 09:11:59 compute-0 sshd-session[74055]: Received disconnect from 192.168.122.100 port 42934:11: disconnected by user
Dec 01 09:11:59 compute-0 sshd-session[74055]: Disconnected from user ceph-admin 192.168.122.100 port 42934
Dec 01 09:11:59 compute-0 sshd-session[74036]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 01 09:11:59 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Dec 01 09:11:59 compute-0 systemd-logind[788]: Session 19 logged out. Waiting for processes to exit.
Dec 01 09:11:59 compute-0 systemd-logind[788]: Removed session 19.
Dec 01 09:12:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat4187034717-merged.mount: Deactivated successfully.
Dec 01 09:12:02 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat4187034717-lower\x2dmapped.mount: Deactivated successfully.
Dec 01 09:12:09 compute-0 systemd[1]: Stopping User Manager for UID 42477...
Dec 01 09:12:09 compute-0 systemd[74040]: Activating special unit Exit the Session...
Dec 01 09:12:09 compute-0 systemd[74040]: Stopped target Main User Target.
Dec 01 09:12:09 compute-0 systemd[74040]: Stopped target Basic System.
Dec 01 09:12:09 compute-0 systemd[74040]: Stopped target Paths.
Dec 01 09:12:09 compute-0 systemd[74040]: Stopped target Sockets.
Dec 01 09:12:09 compute-0 systemd[74040]: Stopped target Timers.
Dec 01 09:12:09 compute-0 systemd[74040]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec 01 09:12:09 compute-0 systemd[74040]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 01 09:12:09 compute-0 systemd[74040]: Closed D-Bus User Message Bus Socket.
Dec 01 09:12:09 compute-0 systemd[74040]: Stopped Create User's Volatile Files and Directories.
Dec 01 09:12:09 compute-0 systemd[74040]: Removed slice User Application Slice.
Dec 01 09:12:09 compute-0 systemd[74040]: Reached target Shutdown.
Dec 01 09:12:09 compute-0 systemd[74040]: Finished Exit the Session.
Dec 01 09:12:09 compute-0 systemd[74040]: Reached target Exit the Session.
Dec 01 09:12:09 compute-0 systemd[1]: user@42477.service: Deactivated successfully.
Dec 01 09:12:09 compute-0 systemd[1]: Stopped User Manager for UID 42477.
Dec 01 09:12:09 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42477...
Dec 01 09:12:09 compute-0 systemd[1]: run-user-42477.mount: Deactivated successfully.
Dec 01 09:12:09 compute-0 systemd[1]: user-runtime-dir@42477.service: Deactivated successfully.
Dec 01 09:12:09 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42477.
Dec 01 09:12:09 compute-0 systemd[1]: Removed slice User Slice of UID 42477.
Dec 01 09:12:16 compute-0 podman[74093]: 2025-12-01 09:12:16.851549721 +0000 UTC m=+17.513510633 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:12:16 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 01 09:12:16 compute-0 podman[74153]: 2025-12-01 09:12:16.913325658 +0000 UTC m=+0.039011561 container create 27845bd968ab9d38165dd71b10a784f0e71a9d12fc6516a43965b6283efb91aa (image=quay.io/ceph/ceph:v18, name=keen_hawking, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, ceph=True, io.buildah.version=1.39.3)
Dec 01 09:12:16 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Dec 01 09:12:16 compute-0 systemd[1]: Started libpod-conmon-27845bd968ab9d38165dd71b10a784f0e71a9d12fc6516a43965b6283efb91aa.scope.
Dec 01 09:12:16 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:12:16 compute-0 podman[74153]: 2025-12-01 09:12:16.896904341 +0000 UTC m=+0.022590264 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:12:17 compute-0 podman[74153]: 2025-12-01 09:12:17.009672137 +0000 UTC m=+0.135358060 container init 27845bd968ab9d38165dd71b10a784f0e71a9d12fc6516a43965b6283efb91aa (image=quay.io/ceph/ceph:v18, name=keen_hawking, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True)
Dec 01 09:12:17 compute-0 podman[74153]: 2025-12-01 09:12:17.01644614 +0000 UTC m=+0.142132043 container start 27845bd968ab9d38165dd71b10a784f0e71a9d12fc6516a43965b6283efb91aa (image=quay.io/ceph/ceph:v18, name=keen_hawking, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=reef)
Dec 01 09:12:17 compute-0 podman[74153]: 2025-12-01 09:12:17.019602359 +0000 UTC m=+0.145288262 container attach 27845bd968ab9d38165dd71b10a784f0e71a9d12fc6516a43965b6283efb91aa (image=quay.io/ceph/ceph:v18, name=keen_hawking, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2)
Dec 01 09:12:17 compute-0 keen_hawking[74169]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable)
Dec 01 09:12:17 compute-0 systemd[1]: libpod-27845bd968ab9d38165dd71b10a784f0e71a9d12fc6516a43965b6283efb91aa.scope: Deactivated successfully.
Dec 01 09:12:17 compute-0 podman[74153]: 2025-12-01 09:12:17.353988947 +0000 UTC m=+0.479674850 container died 27845bd968ab9d38165dd71b10a784f0e71a9d12fc6516a43965b6283efb91aa (image=quay.io/ceph/ceph:v18, name=keen_hawking, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.build-date=20250507)
Dec 01 09:12:17 compute-0 systemd[1]: var-lib-containers-storage-overlay-c97369b947701ee14cc2d4e17e340c0afdd338111f12620cda00af2df1961ddc-merged.mount: Deactivated successfully.
Dec 01 09:12:17 compute-0 podman[74153]: 2025-12-01 09:12:17.396317271 +0000 UTC m=+0.522003174 container remove 27845bd968ab9d38165dd71b10a784f0e71a9d12fc6516a43965b6283efb91aa (image=quay.io/ceph/ceph:v18, name=keen_hawking, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Dec 01 09:12:17 compute-0 systemd[1]: libpod-conmon-27845bd968ab9d38165dd71b10a784f0e71a9d12fc6516a43965b6283efb91aa.scope: Deactivated successfully.
Dec 01 09:12:17 compute-0 podman[74184]: 2025-12-01 09:12:17.458692154 +0000 UTC m=+0.043256731 container create e1a67daf60be357b6e8b9c85837d761d93d23614e8460d82b6f386ece625fa15 (image=quay.io/ceph/ceph:v18, name=modest_goodall, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef)
Dec 01 09:12:17 compute-0 systemd[1]: Started libpod-conmon-e1a67daf60be357b6e8b9c85837d761d93d23614e8460d82b6f386ece625fa15.scope.
Dec 01 09:12:17 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:12:17 compute-0 podman[74184]: 2025-12-01 09:12:17.523982561 +0000 UTC m=+0.108547138 container init e1a67daf60be357b6e8b9c85837d761d93d23614e8460d82b6f386ece625fa15 (image=quay.io/ceph/ceph:v18, name=modest_goodall, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:12:17 compute-0 podman[74184]: 2025-12-01 09:12:17.531824714 +0000 UTC m=+0.116389281 container start e1a67daf60be357b6e8b9c85837d761d93d23614e8460d82b6f386ece625fa15 (image=quay.io/ceph/ceph:v18, name=modest_goodall, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:12:17 compute-0 podman[74184]: 2025-12-01 09:12:17.438606123 +0000 UTC m=+0.023170710 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:12:17 compute-0 podman[74184]: 2025-12-01 09:12:17.534708896 +0000 UTC m=+0.119273463 container attach e1a67daf60be357b6e8b9c85837d761d93d23614e8460d82b6f386ece625fa15 (image=quay.io/ceph/ceph:v18, name=modest_goodall, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_REF=reef, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:12:17 compute-0 modest_goodall[74200]: 167 167
Dec 01 09:12:17 compute-0 systemd[1]: libpod-e1a67daf60be357b6e8b9c85837d761d93d23614e8460d82b6f386ece625fa15.scope: Deactivated successfully.
Dec 01 09:12:17 compute-0 podman[74184]: 2025-12-01 09:12:17.535842038 +0000 UTC m=+0.120406625 container died e1a67daf60be357b6e8b9c85837d761d93d23614e8460d82b6f386ece625fa15 (image=quay.io/ceph/ceph:v18, name=modest_goodall, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3)
Dec 01 09:12:17 compute-0 podman[74184]: 2025-12-01 09:12:17.566662194 +0000 UTC m=+0.151226761 container remove e1a67daf60be357b6e8b9c85837d761d93d23614e8460d82b6f386ece625fa15 (image=quay.io/ceph/ceph:v18, name=modest_goodall, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Dec 01 09:12:17 compute-0 systemd[1]: libpod-conmon-e1a67daf60be357b6e8b9c85837d761d93d23614e8460d82b6f386ece625fa15.scope: Deactivated successfully.
Dec 01 09:12:17 compute-0 podman[74217]: 2025-12-01 09:12:17.62597045 +0000 UTC m=+0.042411957 container create 7c4032f6a2c6a884e19e10b8940d83378102206b4b3376a7099669e22afc2d66 (image=quay.io/ceph/ceph:v18, name=quirky_khorana, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Dec 01 09:12:17 compute-0 systemd[1]: Started libpod-conmon-7c4032f6a2c6a884e19e10b8940d83378102206b4b3376a7099669e22afc2d66.scope.
Dec 01 09:12:17 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:12:17 compute-0 podman[74217]: 2025-12-01 09:12:17.674739457 +0000 UTC m=+0.091180964 container init 7c4032f6a2c6a884e19e10b8940d83378102206b4b3376a7099669e22afc2d66 (image=quay.io/ceph/ceph:v18, name=quirky_khorana, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 01 09:12:17 compute-0 podman[74217]: 2025-12-01 09:12:17.679382469 +0000 UTC m=+0.095823976 container start 7c4032f6a2c6a884e19e10b8940d83378102206b4b3376a7099669e22afc2d66 (image=quay.io/ceph/ceph:v18, name=quirky_khorana, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3)
Dec 01 09:12:17 compute-0 podman[74217]: 2025-12-01 09:12:17.682677923 +0000 UTC m=+0.099119430 container attach 7c4032f6a2c6a884e19e10b8940d83378102206b4b3376a7099669e22afc2d66 (image=quay.io/ceph/ceph:v18, name=quirky_khorana, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3)
Dec 01 09:12:17 compute-0 quirky_khorana[74234]: AQDxWy1pI9WqKRAAmTLyJtNSzDfZ1RFM9WLt+A==
Dec 01 09:12:17 compute-0 podman[74217]: 2025-12-01 09:12:17.602890404 +0000 UTC m=+0.019331931 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:12:17 compute-0 systemd[1]: libpod-7c4032f6a2c6a884e19e10b8940d83378102206b4b3376a7099669e22afc2d66.scope: Deactivated successfully.
Dec 01 09:12:17 compute-0 podman[74217]: 2025-12-01 09:12:17.702007612 +0000 UTC m=+0.118449119 container died 7c4032f6a2c6a884e19e10b8940d83378102206b4b3376a7099669e22afc2d66 (image=quay.io/ceph/ceph:v18, name=quirky_khorana, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:12:17 compute-0 podman[74217]: 2025-12-01 09:12:17.733491148 +0000 UTC m=+0.149932655 container remove 7c4032f6a2c6a884e19e10b8940d83378102206b4b3376a7099669e22afc2d66 (image=quay.io/ceph/ceph:v18, name=quirky_khorana, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:12:17 compute-0 systemd[1]: libpod-conmon-7c4032f6a2c6a884e19e10b8940d83378102206b4b3376a7099669e22afc2d66.scope: Deactivated successfully.
Dec 01 09:12:17 compute-0 podman[74252]: 2025-12-01 09:12:17.784861678 +0000 UTC m=+0.034007928 container create 8ba8f8fe46e2918d1e083809ebb3c357ea167382bf98322cf914474fde972c7f (image=quay.io/ceph/ceph:v18, name=reverent_banach, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2)
Dec 01 09:12:17 compute-0 systemd[1]: Started libpod-conmon-8ba8f8fe46e2918d1e083809ebb3c357ea167382bf98322cf914474fde972c7f.scope.
Dec 01 09:12:17 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:12:17 compute-0 podman[74252]: 2025-12-01 09:12:17.83558384 +0000 UTC m=+0.084730120 container init 8ba8f8fe46e2918d1e083809ebb3c357ea167382bf98322cf914474fde972c7f (image=quay.io/ceph/ceph:v18, name=reverent_banach, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250507, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Dec 01 09:12:17 compute-0 podman[74252]: 2025-12-01 09:12:17.840984704 +0000 UTC m=+0.090130944 container start 8ba8f8fe46e2918d1e083809ebb3c357ea167382bf98322cf914474fde972c7f (image=quay.io/ceph/ceph:v18, name=reverent_banach, ceph=True, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:12:17 compute-0 podman[74252]: 2025-12-01 09:12:17.844408581 +0000 UTC m=+0.093554851 container attach 8ba8f8fe46e2918d1e083809ebb3c357ea167382bf98322cf914474fde972c7f (image=quay.io/ceph/ceph:v18, name=reverent_banach, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:12:17 compute-0 reverent_banach[74268]: AQDxWy1pP3FMMxAAfPV/KsFRiaST9nKEJ2+JCg==
Dec 01 09:12:17 compute-0 systemd[1]: libpod-8ba8f8fe46e2918d1e083809ebb3c357ea167382bf98322cf914474fde972c7f.scope: Deactivated successfully.
Dec 01 09:12:17 compute-0 podman[74252]: 2025-12-01 09:12:17.863820833 +0000 UTC m=+0.112967083 container died 8ba8f8fe46e2918d1e083809ebb3c357ea167382bf98322cf914474fde972c7f (image=quay.io/ceph/ceph:v18, name=reverent_banach, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True)
Dec 01 09:12:17 compute-0 podman[74252]: 2025-12-01 09:12:17.769839011 +0000 UTC m=+0.018985281 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:12:17 compute-0 systemd[1]: var-lib-containers-storage-overlay-a105e0c8dd2ddef39ac4794ab31d183fe03c16bf9f2c14788fd5ed9a85f5209c-merged.mount: Deactivated successfully.
Dec 01 09:12:17 compute-0 podman[74252]: 2025-12-01 09:12:17.897687016 +0000 UTC m=+0.146833266 container remove 8ba8f8fe46e2918d1e083809ebb3c357ea167382bf98322cf914474fde972c7f (image=quay.io/ceph/ceph:v18, name=reverent_banach, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:12:17 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 01 09:12:17 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 01 09:12:17 compute-0 systemd[1]: libpod-conmon-8ba8f8fe46e2918d1e083809ebb3c357ea167382bf98322cf914474fde972c7f.scope: Deactivated successfully.
Dec 01 09:12:17 compute-0 podman[74289]: 2025-12-01 09:12:17.953772801 +0000 UTC m=+0.038607009 container create 8114f80179ff9c7efcef03e5e061877fd2298e7dbd9afb09398203c9a41e9611 (image=quay.io/ceph/ceph:v18, name=youthful_swirles, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef)
Dec 01 09:12:17 compute-0 systemd[1]: Started libpod-conmon-8114f80179ff9c7efcef03e5e061877fd2298e7dbd9afb09398203c9a41e9611.scope.
Dec 01 09:12:17 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:12:18 compute-0 podman[74289]: 2025-12-01 09:12:18.007680574 +0000 UTC m=+0.092514782 container init 8114f80179ff9c7efcef03e5e061877fd2298e7dbd9afb09398203c9a41e9611 (image=quay.io/ceph/ceph:v18, name=youthful_swirles, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Dec 01 09:12:18 compute-0 podman[74289]: 2025-12-01 09:12:18.013007115 +0000 UTC m=+0.097841323 container start 8114f80179ff9c7efcef03e5e061877fd2298e7dbd9afb09398203c9a41e9611 (image=quay.io/ceph/ceph:v18, name=youthful_swirles, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.39.3)
Dec 01 09:12:18 compute-0 podman[74289]: 2025-12-01 09:12:18.01600266 +0000 UTC m=+0.100836898 container attach 8114f80179ff9c7efcef03e5e061877fd2298e7dbd9afb09398203c9a41e9611 (image=quay.io/ceph/ceph:v18, name=youthful_swirles, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:12:18 compute-0 podman[74289]: 2025-12-01 09:12:17.936910711 +0000 UTC m=+0.021744939 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:12:18 compute-0 youthful_swirles[74306]: AQDyWy1pRr36ARAAZpgZl//Z9xb8bhYxEGCeew==
Dec 01 09:12:18 compute-0 systemd[1]: libpod-8114f80179ff9c7efcef03e5e061877fd2298e7dbd9afb09398203c9a41e9611.scope: Deactivated successfully.
Dec 01 09:12:18 compute-0 podman[74289]: 2025-12-01 09:12:18.03709935 +0000 UTC m=+0.121933558 container died 8114f80179ff9c7efcef03e5e061877fd2298e7dbd9afb09398203c9a41e9611 (image=quay.io/ceph/ceph:v18, name=youthful_swirles, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 01 09:12:18 compute-0 podman[74289]: 2025-12-01 09:12:18.067021731 +0000 UTC m=+0.151855939 container remove 8114f80179ff9c7efcef03e5e061877fd2298e7dbd9afb09398203c9a41e9611 (image=quay.io/ceph/ceph:v18, name=youthful_swirles, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef)
Dec 01 09:12:18 compute-0 systemd[1]: libpod-conmon-8114f80179ff9c7efcef03e5e061877fd2298e7dbd9afb09398203c9a41e9611.scope: Deactivated successfully.
Dec 01 09:12:18 compute-0 podman[74324]: 2025-12-01 09:12:18.119576545 +0000 UTC m=+0.035806179 container create 7884ab784b64ea476ec2ea88d00e7d9b2a2875bf37169870b0d91971ac954465 (image=quay.io/ceph/ceph:v18, name=trusting_margulis, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:12:18 compute-0 systemd[1]: Started libpod-conmon-7884ab784b64ea476ec2ea88d00e7d9b2a2875bf37169870b0d91971ac954465.scope.
Dec 01 09:12:18 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:12:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5960d03bd29c24ef9aef0727615e0a698c519e73fbc9cf4fc532b37b74ef8a80/merged/tmp/monmap supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:18 compute-0 podman[74324]: 2025-12-01 09:12:18.171684397 +0000 UTC m=+0.087914051 container init 7884ab784b64ea476ec2ea88d00e7d9b2a2875bf37169870b0d91971ac954465 (image=quay.io/ceph/ceph:v18, name=trusting_margulis, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507)
Dec 01 09:12:18 compute-0 podman[74324]: 2025-12-01 09:12:18.176424682 +0000 UTC m=+0.092654316 container start 7884ab784b64ea476ec2ea88d00e7d9b2a2875bf37169870b0d91971ac954465 (image=quay.io/ceph/ceph:v18, name=trusting_margulis, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Dec 01 09:12:18 compute-0 podman[74324]: 2025-12-01 09:12:18.178999375 +0000 UTC m=+0.095228999 container attach 7884ab784b64ea476ec2ea88d00e7d9b2a2875bf37169870b0d91971ac954465 (image=quay.io/ceph/ceph:v18, name=trusting_margulis, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3)
Dec 01 09:12:18 compute-0 podman[74324]: 2025-12-01 09:12:18.104647711 +0000 UTC m=+0.020877375 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:12:18 compute-0 trusting_margulis[74340]: /usr/bin/monmaptool: monmap file /tmp/monmap
Dec 01 09:12:18 compute-0 trusting_margulis[74340]: setting min_mon_release = pacific
Dec 01 09:12:18 compute-0 trusting_margulis[74340]: /usr/bin/monmaptool: set fsid to 5620a9fb-e540-5250-a0e8-7aaad5347e3b
Dec 01 09:12:18 compute-0 trusting_margulis[74340]: /usr/bin/monmaptool: writing epoch 0 to /tmp/monmap (1 monitors)
Dec 01 09:12:18 compute-0 systemd[1]: libpod-7884ab784b64ea476ec2ea88d00e7d9b2a2875bf37169870b0d91971ac954465.scope: Deactivated successfully.
Dec 01 09:12:18 compute-0 podman[74324]: 2025-12-01 09:12:18.208229096 +0000 UTC m=+0.124458730 container died 7884ab784b64ea476ec2ea88d00e7d9b2a2875bf37169870b0d91971ac954465 (image=quay.io/ceph/ceph:v18, name=trusting_margulis, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507)
Dec 01 09:12:18 compute-0 podman[74324]: 2025-12-01 09:12:18.236249433 +0000 UTC m=+0.152479067 container remove 7884ab784b64ea476ec2ea88d00e7d9b2a2875bf37169870b0d91971ac954465 (image=quay.io/ceph/ceph:v18, name=trusting_margulis, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:12:18 compute-0 systemd[1]: libpod-conmon-7884ab784b64ea476ec2ea88d00e7d9b2a2875bf37169870b0d91971ac954465.scope: Deactivated successfully.
Dec 01 09:12:18 compute-0 podman[74359]: 2025-12-01 09:12:18.297964177 +0000 UTC m=+0.043401695 container create b81ce7f0274254732b370fffb9e363b8424acca1ba99f68026901d079903ef79 (image=quay.io/ceph/ceph:v18, name=objective_jackson, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:12:18 compute-0 systemd[1]: Started libpod-conmon-b81ce7f0274254732b370fffb9e363b8424acca1ba99f68026901d079903ef79.scope.
Dec 01 09:12:18 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:12:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/084135f1ce3964ab6032809724ca6bef79e7091cafc3fc73faef4360db430854/merged/tmp/monmap supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/084135f1ce3964ab6032809724ca6bef79e7091cafc3fc73faef4360db430854/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/084135f1ce3964ab6032809724ca6bef79e7091cafc3fc73faef4360db430854/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/084135f1ce3964ab6032809724ca6bef79e7091cafc3fc73faef4360db430854/merged/var/lib/ceph/mon/ceph-compute-0 supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:18 compute-0 podman[74359]: 2025-12-01 09:12:18.349175603 +0000 UTC m=+0.094613041 container init b81ce7f0274254732b370fffb9e363b8424acca1ba99f68026901d079903ef79 (image=quay.io/ceph/ceph:v18, name=objective_jackson, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True)
Dec 01 09:12:18 compute-0 podman[74359]: 2025-12-01 09:12:18.356262095 +0000 UTC m=+0.101699503 container start b81ce7f0274254732b370fffb9e363b8424acca1ba99f68026901d079903ef79 (image=quay.io/ceph/ceph:v18, name=objective_jackson, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:12:18 compute-0 podman[74359]: 2025-12-01 09:12:18.360057293 +0000 UTC m=+0.105494691 container attach b81ce7f0274254732b370fffb9e363b8424acca1ba99f68026901d079903ef79 (image=quay.io/ceph/ceph:v18, name=objective_jackson, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 01 09:12:18 compute-0 podman[74359]: 2025-12-01 09:12:18.278200185 +0000 UTC m=+0.023637613 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:12:18 compute-0 systemd[1]: libpod-b81ce7f0274254732b370fffb9e363b8424acca1ba99f68026901d079903ef79.scope: Deactivated successfully.
Dec 01 09:12:18 compute-0 podman[74359]: 2025-12-01 09:12:18.419667798 +0000 UTC m=+0.165105206 container died b81ce7f0274254732b370fffb9e363b8424acca1ba99f68026901d079903ef79 (image=quay.io/ceph/ceph:v18, name=objective_jackson, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250507)
Dec 01 09:12:18 compute-0 podman[74359]: 2025-12-01 09:12:18.456366361 +0000 UTC m=+0.201803769 container remove b81ce7f0274254732b370fffb9e363b8424acca1ba99f68026901d079903ef79 (image=quay.io/ceph/ceph:v18, name=objective_jackson, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:12:18 compute-0 systemd[1]: libpod-conmon-b81ce7f0274254732b370fffb9e363b8424acca1ba99f68026901d079903ef79.scope: Deactivated successfully.
Dec 01 09:12:18 compute-0 systemd[1]: Reloading.
Dec 01 09:12:18 compute-0 systemd-rc-local-generator[74443]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:12:18 compute-0 systemd-sysv-generator[74446]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:12:18 compute-0 systemd[1]: Reloading.
Dec 01 09:12:18 compute-0 systemd-rc-local-generator[74480]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:12:18 compute-0 systemd-sysv-generator[74483]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:12:18 compute-0 systemd[1]: Reached target All Ceph clusters and services.
Dec 01 09:12:18 compute-0 systemd[1]: Reloading.
Dec 01 09:12:19 compute-0 systemd-rc-local-generator[74517]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:12:19 compute-0 systemd-sysv-generator[74521]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:12:19 compute-0 systemd[1]: Reached target Ceph cluster 5620a9fb-e540-5250-a0e8-7aaad5347e3b.
Dec 01 09:12:19 compute-0 systemd[1]: Reloading.
Dec 01 09:12:19 compute-0 systemd-rc-local-generator[74557]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:12:19 compute-0 systemd-sysv-generator[74561]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:12:19 compute-0 systemd[1]: Reloading.
Dec 01 09:12:19 compute-0 systemd-rc-local-generator[74595]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:12:19 compute-0 systemd-sysv-generator[74600]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:12:19 compute-0 systemd[1]: Created slice Slice /system/ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b.
Dec 01 09:12:19 compute-0 systemd[1]: Reached target System Time Set.
Dec 01 09:12:19 compute-0 systemd[1]: Reached target System Time Synchronized.
Dec 01 09:12:19 compute-0 systemd[1]: Starting Ceph mon.compute-0 for 5620a9fb-e540-5250-a0e8-7aaad5347e3b...
Dec 01 09:12:19 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 01 09:12:19 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 01 09:12:19 compute-0 podman[74654]: 2025-12-01 09:12:19.952789128 +0000 UTC m=+0.049421386 container create cefa5d72af91d86283f64418a14a9aa1bf5270d72d08d9ff1301e3ba772fd855 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Dec 01 09:12:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c3e450d579be7900aad28102be8a81fcc3c93003a0b07817df9ad5b3f76dae6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c3e450d579be7900aad28102be8a81fcc3c93003a0b07817df9ad5b3f76dae6/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c3e450d579be7900aad28102be8a81fcc3c93003a0b07817df9ad5b3f76dae6/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c3e450d579be7900aad28102be8a81fcc3c93003a0b07817df9ad5b3f76dae6/merged/var/lib/ceph/mon/ceph-compute-0 supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:20 compute-0 podman[74654]: 2025-12-01 09:12:20.014472262 +0000 UTC m=+0.111104550 container init cefa5d72af91d86283f64418a14a9aa1bf5270d72d08d9ff1301e3ba772fd855 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True)
Dec 01 09:12:20 compute-0 podman[74654]: 2025-12-01 09:12:20.019707151 +0000 UTC m=+0.116339399 container start cefa5d72af91d86283f64418a14a9aa1bf5270d72d08d9ff1301e3ba772fd855 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:12:20 compute-0 podman[74654]: 2025-12-01 09:12:19.929529227 +0000 UTC m=+0.026161505 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:12:20 compute-0 bash[74654]: cefa5d72af91d86283f64418a14a9aa1bf5270d72d08d9ff1301e3ba772fd855
Dec 01 09:12:20 compute-0 systemd[1]: Started Ceph mon.compute-0 for 5620a9fb-e540-5250-a0e8-7aaad5347e3b.
Dec 01 09:12:20 compute-0 ceph-mon[74672]: set uid:gid to 167:167 (ceph:ceph)
Dec 01 09:12:20 compute-0 ceph-mon[74672]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mon, pid 2
Dec 01 09:12:20 compute-0 ceph-mon[74672]: pidfile_write: ignore empty --pid-file
Dec 01 09:12:20 compute-0 ceph-mon[74672]: load: jerasure load: lrc 
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: RocksDB version: 7.9.2
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: Git sha 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: Compile date 2025-05-06 23:30:25
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: DB SUMMARY
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: DB Session ID:  3E2ZTRY0NM64UCA3TJW4
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: CURRENT file:  CURRENT
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: IDENTITY file:  IDENTITY
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-0/store.db dir, Total Num: 0, files: 
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-0/store.db: 000004.log size: 807 ; 
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                         Options.error_if_exists: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                       Options.create_if_missing: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                         Options.paranoid_checks: 1
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                                     Options.env: 0x557647b36c40
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                                      Options.fs: PosixFileSystem
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                                Options.info_log: 0x557649962e80
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                Options.max_file_opening_threads: 16
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                              Options.statistics: (nil)
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                               Options.use_fsync: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                       Options.max_log_file_size: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                         Options.allow_fallocate: 1
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                        Options.use_direct_reads: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:          Options.create_missing_column_families: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                              Options.db_log_dir: 
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                                 Options.wal_dir: 
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                   Options.advise_random_on_open: 1
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                    Options.write_buffer_manager: 0x557649972b40
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                            Options.rate_limiter: (nil)
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                  Options.unordered_write: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                               Options.row_cache: None
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                              Options.wal_filter: None
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:             Options.allow_ingest_behind: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:             Options.two_write_queues: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:             Options.manual_wal_flush: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:             Options.wal_compression: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:             Options.atomic_flush: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                 Options.log_readahead_size: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:             Options.allow_data_in_errors: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:             Options.db_host_id: __hostname__
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:             Options.max_background_jobs: 2
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:             Options.max_background_compactions: -1
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:             Options.max_subcompactions: 1
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:             Options.max_total_wal_size: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                          Options.max_open_files: -1
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                          Options.bytes_per_sync: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:       Options.compaction_readahead_size: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                  Options.max_background_flushes: -1
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: Compression algorithms supported:
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:         kZSTD supported: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:         kXpressCompression supported: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:         kBZip2Compression supported: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:         kLZ4Compression supported: 1
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:         kZlibCompression supported: 1
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:         kLZ4HCCompression supported: 1
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:         kSnappyCompression supported: 1
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-0/store.db/MANIFEST-000005
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:           Options.merge_operator: 
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:        Options.compaction_filter: None
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557649962a80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55764995b1f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:        Options.write_buffer_size: 33554432
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:  Options.max_write_buffer_number: 2
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:          Options.compression: NoCompression
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:             Options.num_levels: 7
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-0/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 45d3ecca-3e60-40df-8d21-b0b3630e7b99
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580340065528, "job": 1, "event": "recovery_started", "wal_files": [4]}
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580340067969, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1944, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 819, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 696, "raw_average_value_size": 139, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580340, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "3E2ZTRY0NM64UCA3TJW4", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580340068175, "job": 1, "event": "recovery_finished"}
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x557649984e00
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: DB pointer 0x557649a8e000
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:12:20 compute-0 ceph-mon[74672]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.90 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      1/0    1.90 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.16 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.16 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55764995b1f0#2 capacity: 512.00 MB usage: 1.17 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 4.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(2,0.95 KB,0.000181794%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Dec 01 09:12:20 compute-0 ceph-mon[74672]: starting mon.compute-0 rank 0 at public addrs [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] at bind addrs [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-0 fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b
Dec 01 09:12:20 compute-0 ceph-mon[74672]: mon.compute-0@-1(???) e0 preinit fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b
Dec 01 09:12:20 compute-0 ceph-mon[74672]: mon.compute-0@-1(probing) e0  my rank is now 0 (was -1)
Dec 01 09:12:20 compute-0 ceph-mon[74672]: mon.compute-0@0(probing) e0 win_standalone_election
Dec 01 09:12:20 compute-0 ceph-mon[74672]: paxos.0).electionLogic(0) init, first boot, initializing epoch at 1 
Dec 01 09:12:20 compute-0 ceph-mon[74672]: mon.compute-0@0(electing) e0 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 01 09:12:20 compute-0 ceph-mon[74672]: log_channel(cluster) log [INF] : mon.compute-0 is new leader, mons compute-0 in quorum (ranks 0)
Dec 01 09:12:20 compute-0 ceph-mon[74672]: mon.compute-0@0(leader).osd e0 create_pending setting backfillfull_ratio = 0.9
Dec 01 09:12:20 compute-0 ceph-mon[74672]: mon.compute-0@0(leader).osd e0 create_pending setting full_ratio = 0.95
Dec 01 09:12:20 compute-0 ceph-mon[74672]: mon.compute-0@0(leader).osd e0 create_pending setting nearfull_ratio = 0.85
Dec 01 09:12:20 compute-0 ceph-mon[74672]: mon.compute-0@0(leader).osd e0 do_prune osdmap full prune enabled
Dec 01 09:12:20 compute-0 ceph-mon[74672]: mon.compute-0@0(leader).osd e0 encode_pending skipping prime_pg_temp; mapping job did not start
Dec 01 09:12:20 compute-0 ceph-mon[74672]: mon.compute-0@0(leader) e0 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Dec 01 09:12:20 compute-0 ceph-mon[74672]: mon.compute-0@0(leader).paxosservice(auth 0..0) refresh upgraded, format 3 -> 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: mon.compute-0@0(probing) e1 win_standalone_election
Dec 01 09:12:20 compute-0 ceph-mon[74672]: paxos.0).electionLogic(2) init, last seen epoch 2
Dec 01 09:12:20 compute-0 ceph-mon[74672]: mon.compute-0@0(electing) e1 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 01 09:12:20 compute-0 ceph-mon[74672]: log_channel(cluster) log [INF] : mon.compute-0 is new leader, mons compute-0 in quorum (ranks 0)
Dec 01 09:12:20 compute-0 ceph-mon[74672]: log_channel(cluster) log [DBG] : monmap e1: 1 mons at {compute-0=[v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0]} removed_ranks: {} disallowed_leaders: {}
Dec 01 09:12:20 compute-0 ceph-mon[74672]: mon.compute-0@0(leader) e1 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 01 09:12:20 compute-0 ceph-mon[74672]: mgrc update_daemon_metadata mon.compute-0 metadata {addrs=[v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),ceph_version_short=18.2.7,ceph_version_when_created=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=compute-0,container_image=quay.io/ceph/ceph:v18,cpu=AMD EPYC-Rome Processor,created_at=2025-12-01T09:12:18.391342Z,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=centos,distro_description=CentOS Stream 9,distro_version=9,hostname=compute-0,kernel_description=#1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025,kernel_version=5.14.0-642.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864320,os=Linux}
Dec 01 09:12:20 compute-0 ceph-mon[74672]: mon.compute-0@0(leader).osd e0 create_pending setting backfillfull_ratio = 0.9
Dec 01 09:12:20 compute-0 ceph-mon[74672]: mon.compute-0@0(leader).osd e0 create_pending setting full_ratio = 0.95
Dec 01 09:12:20 compute-0 ceph-mon[74672]: mon.compute-0@0(leader).osd e0 create_pending setting nearfull_ratio = 0.85
Dec 01 09:12:20 compute-0 ceph-mon[74672]: mon.compute-0@0(leader).osd e0 do_prune osdmap full prune enabled
Dec 01 09:12:20 compute-0 ceph-mon[74672]: mon.compute-0@0(leader).osd e0 encode_pending skipping prime_pg_temp; mapping job did not start
Dec 01 09:12:20 compute-0 ceph-mon[74672]: mon.compute-0@0(leader) e1 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Dec 01 09:12:20 compute-0 ceph-mon[74672]: mon.compute-0@0(leader).mds e1 new map
Dec 01 09:12:20 compute-0 ceph-mon[74672]: mon.compute-0@0(leader).mds e1 print_map
                                           e1
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           legacy client fscid: -1
                                            
                                           No filesystems configured
Dec 01 09:12:20 compute-0 ceph-mon[74672]: mon.compute-0@0(leader).paxosservice(auth 0..0) refresh upgraded, format 3 -> 0
Dec 01 09:12:20 compute-0 ceph-mon[74672]: log_channel(cluster) log [DBG] : fsmap 
Dec 01 09:12:20 compute-0 ceph-mon[74672]: mon.compute-0@0(leader).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Dec 01 09:12:20 compute-0 ceph-mon[74672]: mon.compute-0@0(leader).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Dec 01 09:12:20 compute-0 ceph-mon[74672]: mon.compute-0@0(leader).osd e1 e1: 0 total, 0 up, 0 in
Dec 01 09:12:20 compute-0 ceph-mon[74672]: mon.compute-0@0(leader).osd e1 crush map has features 3314932999778484224, adjusting msgr requires
Dec 01 09:12:20 compute-0 ceph-mon[74672]: mkfs 5620a9fb-e540-5250-a0e8-7aaad5347e3b
Dec 01 09:12:20 compute-0 ceph-mon[74672]: mon.compute-0@0(leader).osd e1 crush map has features 288514050185494528, adjusting msgr requires
Dec 01 09:12:20 compute-0 ceph-mon[74672]: mon.compute-0@0(leader).osd e1 crush map has features 288514050185494528, adjusting msgr requires
Dec 01 09:12:20 compute-0 ceph-mon[74672]: mon.compute-0@0(leader).osd e1 crush map has features 288514050185494528, adjusting msgr requires
Dec 01 09:12:20 compute-0 ceph-mon[74672]: mon.compute-0@0(leader).paxosservice(auth 1..1) refresh upgraded, format 0 -> 3
Dec 01 09:12:20 compute-0 ceph-mon[74672]: log_channel(cluster) log [DBG] : osdmap e1: 0 total, 0 up, 0 in
Dec 01 09:12:20 compute-0 podman[74676]: 2025-12-01 09:12:20.115976008 +0000 UTC m=+0.039582376 container create c5b728eea6004060c2c79ddc5c6e4d77ad9f84ee48b08476c572a27f3cf33eb2 (image=quay.io/ceph/ceph:v18, name=nifty_matsumoto, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec 01 09:12:20 compute-0 ceph-mon[74672]: log_channel(cluster) log [DBG] : mgrmap e1: no daemons active
Dec 01 09:12:20 compute-0 ceph-mon[74672]: mon.compute-0 is new leader, mons compute-0 in quorum (ranks 0)
Dec 01 09:12:20 compute-0 systemd[1]: Started libpod-conmon-c5b728eea6004060c2c79ddc5c6e4d77ad9f84ee48b08476c572a27f3cf33eb2.scope.
Dec 01 09:12:20 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:12:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab55e3c47e9c0f75159d73f9de288637a04c584ef936f9586d99797e42236072/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab55e3c47e9c0f75159d73f9de288637a04c584ef936f9586d99797e42236072/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab55e3c47e9c0f75159d73f9de288637a04c584ef936f9586d99797e42236072/merged/var/lib/ceph/mon/ceph-compute-0 supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:20 compute-0 podman[74676]: 2025-12-01 09:12:20.098724988 +0000 UTC m=+0.022331376 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:12:20 compute-0 podman[74676]: 2025-12-01 09:12:20.197140716 +0000 UTC m=+0.120747084 container init c5b728eea6004060c2c79ddc5c6e4d77ad9f84ee48b08476c572a27f3cf33eb2 (image=quay.io/ceph/ceph:v18, name=nifty_matsumoto, CEPH_REF=reef, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, OSD_FLAVOR=default, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:12:20 compute-0 podman[74676]: 2025-12-01 09:12:20.205765631 +0000 UTC m=+0.129372000 container start c5b728eea6004060c2c79ddc5c6e4d77ad9f84ee48b08476c572a27f3cf33eb2 (image=quay.io/ceph/ceph:v18, name=nifty_matsumoto, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0)
Dec 01 09:12:20 compute-0 podman[74676]: 2025-12-01 09:12:20.208800348 +0000 UTC m=+0.132406716 container attach c5b728eea6004060c2c79ddc5c6e4d77ad9f84ee48b08476c572a27f3cf33eb2 (image=quay.io/ceph/ceph:v18, name=nifty_matsumoto, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:12:20 compute-0 ceph-mon[74672]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status"} v 0) v1
Dec 01 09:12:20 compute-0 ceph-mon[74672]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1884213047' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec 01 09:12:20 compute-0 nifty_matsumoto[74725]:   cluster:
Dec 01 09:12:20 compute-0 nifty_matsumoto[74725]:     id:     5620a9fb-e540-5250-a0e8-7aaad5347e3b
Dec 01 09:12:20 compute-0 nifty_matsumoto[74725]:     health: HEALTH_OK
Dec 01 09:12:20 compute-0 nifty_matsumoto[74725]:  
Dec 01 09:12:20 compute-0 nifty_matsumoto[74725]:   services:
Dec 01 09:12:20 compute-0 nifty_matsumoto[74725]:     mon: 1 daemons, quorum compute-0 (age 0.516339s)
Dec 01 09:12:20 compute-0 nifty_matsumoto[74725]:     mgr: no daemons active
Dec 01 09:12:20 compute-0 nifty_matsumoto[74725]:     osd: 0 osds: 0 up, 0 in
Dec 01 09:12:20 compute-0 nifty_matsumoto[74725]:  
Dec 01 09:12:20 compute-0 nifty_matsumoto[74725]:   data:
Dec 01 09:12:20 compute-0 nifty_matsumoto[74725]:     pools:   0 pools, 0 pgs
Dec 01 09:12:20 compute-0 nifty_matsumoto[74725]:     objects: 0 objects, 0 B
Dec 01 09:12:20 compute-0 nifty_matsumoto[74725]:     usage:   0 B used, 0 B / 0 B avail
Dec 01 09:12:20 compute-0 nifty_matsumoto[74725]:     pgs:     
Dec 01 09:12:20 compute-0 nifty_matsumoto[74725]:  
Dec 01 09:12:20 compute-0 systemd[1]: libpod-c5b728eea6004060c2c79ddc5c6e4d77ad9f84ee48b08476c572a27f3cf33eb2.scope: Deactivated successfully.
Dec 01 09:12:20 compute-0 podman[74676]: 2025-12-01 09:12:20.632728271 +0000 UTC m=+0.556334629 container died c5b728eea6004060c2c79ddc5c6e4d77ad9f84ee48b08476c572a27f3cf33eb2 (image=quay.io/ceph/ceph:v18, name=nifty_matsumoto, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:12:20 compute-0 systemd[1]: var-lib-containers-storage-overlay-ab55e3c47e9c0f75159d73f9de288637a04c584ef936f9586d99797e42236072-merged.mount: Deactivated successfully.
Dec 01 09:12:20 compute-0 podman[74676]: 2025-12-01 09:12:20.681709774 +0000 UTC m=+0.605316132 container remove c5b728eea6004060c2c79ddc5c6e4d77ad9f84ee48b08476c572a27f3cf33eb2 (image=quay.io/ceph/ceph:v18, name=nifty_matsumoto, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:12:20 compute-0 systemd[1]: libpod-conmon-c5b728eea6004060c2c79ddc5c6e4d77ad9f84ee48b08476c572a27f3cf33eb2.scope: Deactivated successfully.
Dec 01 09:12:20 compute-0 podman[74768]: 2025-12-01 09:12:20.743792709 +0000 UTC m=+0.042966052 container create c5a379453b4f239935113a32f2d0cae4e9c065b0785ef968659a4099275f41a2 (image=quay.io/ceph/ceph:v18, name=competent_albattani, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:12:20 compute-0 systemd[1]: Started libpod-conmon-c5a379453b4f239935113a32f2d0cae4e9c065b0785ef968659a4099275f41a2.scope.
Dec 01 09:12:20 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:12:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a88ca7a93efb54ff26eec6476c77042d20cfef9fdedbd701a4f424ef3e3da96/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a88ca7a93efb54ff26eec6476c77042d20cfef9fdedbd701a4f424ef3e3da96/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a88ca7a93efb54ff26eec6476c77042d20cfef9fdedbd701a4f424ef3e3da96/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a88ca7a93efb54ff26eec6476c77042d20cfef9fdedbd701a4f424ef3e3da96/merged/var/lib/ceph/mon/ceph-compute-0 supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:20 compute-0 podman[74768]: 2025-12-01 09:12:20.815040125 +0000 UTC m=+0.114213478 container init c5a379453b4f239935113a32f2d0cae4e9c065b0785ef968659a4099275f41a2 (image=quay.io/ceph/ceph:v18, name=competent_albattani, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:12:20 compute-0 podman[74768]: 2025-12-01 09:12:20.723648677 +0000 UTC m=+0.022822050 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:12:20 compute-0 podman[74768]: 2025-12-01 09:12:20.819957995 +0000 UTC m=+0.119131338 container start c5a379453b4f239935113a32f2d0cae4e9c065b0785ef968659a4099275f41a2 (image=quay.io/ceph/ceph:v18, name=competent_albattani, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.39.3)
Dec 01 09:12:20 compute-0 podman[74768]: 2025-12-01 09:12:20.823359112 +0000 UTC m=+0.122532455 container attach c5a379453b4f239935113a32f2d0cae4e9c065b0785ef968659a4099275f41a2 (image=quay.io/ceph/ceph:v18, name=competent_albattani, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:12:21 compute-0 ceph-mon[74672]: mon.compute-0 is new leader, mons compute-0 in quorum (ranks 0)
Dec 01 09:12:21 compute-0 ceph-mon[74672]: monmap e1: 1 mons at {compute-0=[v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0]} removed_ranks: {} disallowed_leaders: {}
Dec 01 09:12:21 compute-0 ceph-mon[74672]: fsmap 
Dec 01 09:12:21 compute-0 ceph-mon[74672]: osdmap e1: 0 total, 0 up, 0 in
Dec 01 09:12:21 compute-0 ceph-mon[74672]: mgrmap e1: no daemons active
Dec 01 09:12:21 compute-0 ceph-mon[74672]: from='client.? 192.168.122.100:0/1884213047' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec 01 09:12:21 compute-0 ceph-mon[74672]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config assimilate-conf"} v 0) v1
Dec 01 09:12:21 compute-0 ceph-mon[74672]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/4273641964' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Dec 01 09:12:21 compute-0 ceph-mon[74672]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/4273641964' entity='client.admin' cmd='[{"prefix": "config assimilate-conf"}]': finished
Dec 01 09:12:21 compute-0 competent_albattani[74786]: 
Dec 01 09:12:21 compute-0 competent_albattani[74786]: [global]
Dec 01 09:12:21 compute-0 competent_albattani[74786]:         fsid = 5620a9fb-e540-5250-a0e8-7aaad5347e3b
Dec 01 09:12:21 compute-0 competent_albattani[74786]:         mon_host = [v2:192.168.122.100:3300,v1:192.168.122.100:6789]
Dec 01 09:12:21 compute-0 competent_albattani[74786]:         osd_crush_chooseleaf_type = 0
Dec 01 09:12:21 compute-0 systemd[1]: libpod-c5a379453b4f239935113a32f2d0cae4e9c065b0785ef968659a4099275f41a2.scope: Deactivated successfully.
Dec 01 09:12:21 compute-0 podman[74768]: 2025-12-01 09:12:21.23336844 +0000 UTC m=+0.532541793 container died c5a379453b4f239935113a32f2d0cae4e9c065b0785ef968659a4099275f41a2 (image=quay.io/ceph/ceph:v18, name=competent_albattani, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec 01 09:12:21 compute-0 systemd[1]: var-lib-containers-storage-overlay-1a88ca7a93efb54ff26eec6476c77042d20cfef9fdedbd701a4f424ef3e3da96-merged.mount: Deactivated successfully.
Dec 01 09:12:21 compute-0 podman[74768]: 2025-12-01 09:12:21.276972109 +0000 UTC m=+0.576145442 container remove c5a379453b4f239935113a32f2d0cae4e9c065b0785ef968659a4099275f41a2 (image=quay.io/ceph/ceph:v18, name=competent_albattani, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, ceph=True)
Dec 01 09:12:21 compute-0 systemd[1]: libpod-conmon-c5a379453b4f239935113a32f2d0cae4e9c065b0785ef968659a4099275f41a2.scope: Deactivated successfully.
Dec 01 09:12:21 compute-0 podman[74823]: 2025-12-01 09:12:21.336109291 +0000 UTC m=+0.038999280 container create ea55ad1b34f0480a2640fc2e540dc6a63c6c2d6fbb7ceeedce0dfd7ddd8d4caa (image=quay.io/ceph/ceph:v18, name=quirky_khorana, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:12:21 compute-0 systemd[1]: Started libpod-conmon-ea55ad1b34f0480a2640fc2e540dc6a63c6c2d6fbb7ceeedce0dfd7ddd8d4caa.scope.
Dec 01 09:12:21 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:12:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab92c57de76ca6b6240b70a4f7d798d245ca818f7fb1dae48860aede3acf2bd4/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab92c57de76ca6b6240b70a4f7d798d245ca818f7fb1dae48860aede3acf2bd4/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab92c57de76ca6b6240b70a4f7d798d245ca818f7fb1dae48860aede3acf2bd4/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab92c57de76ca6b6240b70a4f7d798d245ca818f7fb1dae48860aede3acf2bd4/merged/var/lib/ceph/mon/ceph-compute-0 supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:21 compute-0 podman[74823]: 2025-12-01 09:12:21.411772032 +0000 UTC m=+0.114662071 container init ea55ad1b34f0480a2640fc2e540dc6a63c6c2d6fbb7ceeedce0dfd7ddd8d4caa (image=quay.io/ceph/ceph:v18, name=quirky_khorana, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS)
Dec 01 09:12:21 compute-0 podman[74823]: 2025-12-01 09:12:21.32025736 +0000 UTC m=+0.023147369 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:12:21 compute-0 podman[74823]: 2025-12-01 09:12:21.421436197 +0000 UTC m=+0.124326186 container start ea55ad1b34f0480a2640fc2e540dc6a63c6c2d6fbb7ceeedce0dfd7ddd8d4caa (image=quay.io/ceph/ceph:v18, name=quirky_khorana, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3)
Dec 01 09:12:21 compute-0 podman[74823]: 2025-12-01 09:12:21.424780002 +0000 UTC m=+0.127669991 container attach ea55ad1b34f0480a2640fc2e540dc6a63c6c2d6fbb7ceeedce0dfd7ddd8d4caa (image=quay.io/ceph/ceph:v18, name=quirky_khorana, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3)
Dec 01 09:12:21 compute-0 ceph-mon[74672]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:12:21 compute-0 ceph-mon[74672]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3612672887' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:12:21 compute-0 systemd[1]: libpod-ea55ad1b34f0480a2640fc2e540dc6a63c6c2d6fbb7ceeedce0dfd7ddd8d4caa.scope: Deactivated successfully.
Dec 01 09:12:21 compute-0 podman[74823]: 2025-12-01 09:12:21.823458608 +0000 UTC m=+0.526348597 container died ea55ad1b34f0480a2640fc2e540dc6a63c6c2d6fbb7ceeedce0dfd7ddd8d4caa (image=quay.io/ceph/ceph:v18, name=quirky_khorana, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 01 09:12:21 compute-0 systemd[1]: var-lib-containers-storage-overlay-ab92c57de76ca6b6240b70a4f7d798d245ca818f7fb1dae48860aede3acf2bd4-merged.mount: Deactivated successfully.
Dec 01 09:12:21 compute-0 podman[74823]: 2025-12-01 09:12:21.868666113 +0000 UTC m=+0.571556102 container remove ea55ad1b34f0480a2640fc2e540dc6a63c6c2d6fbb7ceeedce0dfd7ddd8d4caa (image=quay.io/ceph/ceph:v18, name=quirky_khorana, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec 01 09:12:21 compute-0 systemd[1]: libpod-conmon-ea55ad1b34f0480a2640fc2e540dc6a63c6c2d6fbb7ceeedce0dfd7ddd8d4caa.scope: Deactivated successfully.
Dec 01 09:12:21 compute-0 systemd[1]: Stopping Ceph mon.compute-0 for 5620a9fb-e540-5250-a0e8-7aaad5347e3b...
Dec 01 09:12:22 compute-0 ceph-mon[74672]: received  signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.compute-0 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false  (PID: 1) UID: 0
Dec 01 09:12:22 compute-0 ceph-mon[74672]: mon.compute-0@0(leader) e1 *** Got Signal Terminated ***
Dec 01 09:12:22 compute-0 ceph-mon[74672]: mon.compute-0@0(leader) e1 shutdown
Dec 01 09:12:22 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0[74668]: 2025-12-01T09:12:22.050+0000 7f739e8c7640 -1 received  signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.compute-0 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false  (PID: 1) UID: 0
Dec 01 09:12:22 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0[74668]: 2025-12-01T09:12:22.050+0000 7f739e8c7640 -1 mon.compute-0@0(leader) e1 *** Got Signal Terminated ***
Dec 01 09:12:22 compute-0 ceph-mon[74672]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Dec 01 09:12:22 compute-0 ceph-mon[74672]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Dec 01 09:12:22 compute-0 podman[74907]: 2025-12-01 09:12:22.306001828 +0000 UTC m=+0.287144645 container died cefa5d72af91d86283f64418a14a9aa1bf5270d72d08d9ff1301e3ba772fd855 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.build-date=20250507, ceph=True, io.buildah.version=1.39.3)
Dec 01 09:12:22 compute-0 systemd[1]: var-lib-containers-storage-overlay-6c3e450d579be7900aad28102be8a81fcc3c93003a0b07817df9ad5b3f76dae6-merged.mount: Deactivated successfully.
Dec 01 09:12:22 compute-0 podman[74907]: 2025-12-01 09:12:22.34264093 +0000 UTC m=+0.323783757 container remove cefa5d72af91d86283f64418a14a9aa1bf5270d72d08d9ff1301e3ba772fd855 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:12:22 compute-0 bash[74907]: ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0
Dec 01 09:12:22 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 01 09:12:22 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 01 09:12:22 compute-0 systemd[1]: ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b@mon.compute-0.service: Deactivated successfully.
Dec 01 09:12:22 compute-0 systemd[1]: Stopped Ceph mon.compute-0 for 5620a9fb-e540-5250-a0e8-7aaad5347e3b.
Dec 01 09:12:22 compute-0 systemd[1]: Starting Ceph mon.compute-0 for 5620a9fb-e540-5250-a0e8-7aaad5347e3b...
Dec 01 09:12:22 compute-0 podman[75011]: 2025-12-01 09:12:22.663057701 +0000 UTC m=+0.034416970 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:12:22 compute-0 podman[75011]: 2025-12-01 09:12:22.776721221 +0000 UTC m=+0.148080490 container create a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Dec 01 09:12:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/25e1f43ce5c8d6f2ee734a07843f5f828178b98c2cd26e90afe9298f37e21e66/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/25e1f43ce5c8d6f2ee734a07843f5f828178b98c2cd26e90afe9298f37e21e66/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/25e1f43ce5c8d6f2ee734a07843f5f828178b98c2cd26e90afe9298f37e21e66/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/25e1f43ce5c8d6f2ee734a07843f5f828178b98c2cd26e90afe9298f37e21e66/merged/var/lib/ceph/mon/ceph-compute-0 supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:22 compute-0 podman[75011]: 2025-12-01 09:12:22.835632557 +0000 UTC m=+0.206991836 container init a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507)
Dec 01 09:12:22 compute-0 podman[75011]: 2025-12-01 09:12:22.840773553 +0000 UTC m=+0.212132812 container start a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:12:22 compute-0 bash[75011]: a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7
Dec 01 09:12:22 compute-0 systemd[1]: Started Ceph mon.compute-0 for 5620a9fb-e540-5250-a0e8-7aaad5347e3b.
Dec 01 09:12:22 compute-0 ceph-mon[75031]: set uid:gid to 167:167 (ceph:ceph)
Dec 01 09:12:22 compute-0 ceph-mon[75031]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mon, pid 2
Dec 01 09:12:22 compute-0 ceph-mon[75031]: pidfile_write: ignore empty --pid-file
Dec 01 09:12:22 compute-0 ceph-mon[75031]: load: jerasure load: lrc 
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: RocksDB version: 7.9.2
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: Git sha 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: Compile date 2025-05-06 23:30:25
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: DB SUMMARY
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: DB Session ID:  2DUIFG3VBWNEITLEK8RC
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: CURRENT file:  CURRENT
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: IDENTITY file:  IDENTITY
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: MANIFEST file:  MANIFEST-000010 size: 179 Bytes
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-0/store.db dir, Total Num: 1, files: 000008.sst 
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-0/store.db: 000009.log size: 52078 ; 
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                         Options.error_if_exists: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                       Options.create_if_missing: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                         Options.paranoid_checks: 1
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                                     Options.env: 0x55bbd48ffc40
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                                      Options.fs: PosixFileSystem
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                                Options.info_log: 0x55bbd56bd040
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                Options.max_file_opening_threads: 16
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                              Options.statistics: (nil)
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                               Options.use_fsync: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                       Options.max_log_file_size: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                         Options.allow_fallocate: 1
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                        Options.use_direct_reads: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:          Options.create_missing_column_families: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                              Options.db_log_dir: 
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                                 Options.wal_dir: 
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                   Options.advise_random_on_open: 1
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                    Options.write_buffer_manager: 0x55bbd56ccb40
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                            Options.rate_limiter: (nil)
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                  Options.unordered_write: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                               Options.row_cache: None
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                              Options.wal_filter: None
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:             Options.allow_ingest_behind: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:             Options.two_write_queues: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:             Options.manual_wal_flush: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:             Options.wal_compression: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:             Options.atomic_flush: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                 Options.log_readahead_size: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:             Options.allow_data_in_errors: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:             Options.db_host_id: __hostname__
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:             Options.max_background_jobs: 2
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:             Options.max_background_compactions: -1
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:             Options.max_subcompactions: 1
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:             Options.max_total_wal_size: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                          Options.max_open_files: -1
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                          Options.bytes_per_sync: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:       Options.compaction_readahead_size: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                  Options.max_background_flushes: -1
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: Compression algorithms supported:
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:         kZSTD supported: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:         kXpressCompression supported: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:         kBZip2Compression supported: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:         kLZ4Compression supported: 1
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:         kZlibCompression supported: 1
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:         kLZ4HCCompression supported: 1
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:         kSnappyCompression supported: 1
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-0/store.db/MANIFEST-000010
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:           Options.merge_operator: 
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:        Options.compaction_filter: None
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bbd56bcc40)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55bbd56b51f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:        Options.write_buffer_size: 33554432
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:  Options.max_write_buffer_number: 2
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:          Options.compression: NoCompression
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:             Options.num_levels: 7
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-0/store.db/MANIFEST-000010 succeeded,manifest_file_number is 10, next_file_number is 12, last_sequence is 5, log_number is 5,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 5
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 45d3ecca-3e60-40df-8d21-b0b3630e7b99
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580342880545, "job": 1, "event": "recovery_started", "wal_files": [9]}
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #9 mode 2
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580342883395, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 13, "file_size": 51794, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 8, "largest_seqno": 129, "table_properties": {"data_size": 50351, "index_size": 149, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 261, "raw_key_size": 2940, "raw_average_key_size": 30, "raw_value_size": 48030, "raw_average_value_size": 500, "num_data_blocks": 7, "num_entries": 96, "num_filter_entries": 96, "num_deletions": 3, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580342, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 13, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580342883504, "job": 1, "event": "recovery_finished"}
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: [db/version_set.cc:5047] Creating manifest 15
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55bbd56dee00
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: DB pointer 0x55bbd5768000
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:12:22 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0   52.48 KB   0.5      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     19.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      2/0   52.48 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     19.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     19.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     19.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 4.12 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 4.12 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55bbd56b51f0#2 capacity: 512.00 MB usage: 0.77 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 2.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(2,0.42 KB,8.04663e-05%) IndexBlock(2,0.34 KB,6.55651e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Dec 01 09:12:22 compute-0 ceph-mon[75031]: starting mon.compute-0 rank 0 at public addrs [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] at bind addrs [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-0 fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b
Dec 01 09:12:22 compute-0 ceph-mon[75031]: mon.compute-0@-1(???) e1 preinit fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b
Dec 01 09:12:22 compute-0 ceph-mon[75031]: mon.compute-0@-1(???).mds e1 new map
Dec 01 09:12:22 compute-0 ceph-mon[75031]: mon.compute-0@-1(???).mds e1 print_map
                                           e1
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           legacy client fscid: -1
                                            
                                           No filesystems configured
Dec 01 09:12:22 compute-0 ceph-mon[75031]: mon.compute-0@-1(???).osd e1 crush map has features 3314932999778484224, adjusting msgr requires
Dec 01 09:12:22 compute-0 ceph-mon[75031]: mon.compute-0@-1(???).osd e1 crush map has features 288514050185494528, adjusting msgr requires
Dec 01 09:12:22 compute-0 ceph-mon[75031]: mon.compute-0@-1(???).osd e1 crush map has features 288514050185494528, adjusting msgr requires
Dec 01 09:12:22 compute-0 ceph-mon[75031]: mon.compute-0@-1(???).osd e1 crush map has features 288514050185494528, adjusting msgr requires
Dec 01 09:12:22 compute-0 ceph-mon[75031]: mon.compute-0@-1(???).paxosservice(auth 1..2) refresh upgraded, format 0 -> 3
Dec 01 09:12:22 compute-0 ceph-mon[75031]: mon.compute-0@-1(probing) e1  my rank is now 0 (was -1)
Dec 01 09:12:22 compute-0 ceph-mon[75031]: mon.compute-0@0(probing) e1 win_standalone_election
Dec 01 09:12:22 compute-0 ceph-mon[75031]: paxos.0).electionLogic(3) init, last seen epoch 3, mid-election, bumping
Dec 01 09:12:22 compute-0 ceph-mon[75031]: mon.compute-0@0(electing) e1 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 01 09:12:22 compute-0 ceph-mon[75031]: log_channel(cluster) log [INF] : mon.compute-0 is new leader, mons compute-0 in quorum (ranks 0)
Dec 01 09:12:22 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : monmap e1: 1 mons at {compute-0=[v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0]} removed_ranks: {} disallowed_leaders: {}
Dec 01 09:12:22 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 01 09:12:22 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : fsmap 
Dec 01 09:12:22 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e1: 0 total, 0 up, 0 in
Dec 01 09:12:22 compute-0 podman[75032]: 2025-12-01 09:12:22.905702989 +0000 UTC m=+0.039463353 container create 8cd0c44f0f4f3aef750be6be0fd2554644b006d303fa2a50cfb7776b5ed09744 (image=quay.io/ceph/ceph:v18, name=musing_haslett, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:12:22 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : mgrmap e1: no daemons active
Dec 01 09:12:22 compute-0 systemd[1]: Started libpod-conmon-8cd0c44f0f4f3aef750be6be0fd2554644b006d303fa2a50cfb7776b5ed09744.scope.
Dec 01 09:12:22 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:12:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f150891d01ab69b45cabda0c8dc4f9353fd69c804f4ea9e1dbe350002188fb7/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f150891d01ab69b45cabda0c8dc4f9353fd69c804f4ea9e1dbe350002188fb7/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f150891d01ab69b45cabda0c8dc4f9353fd69c804f4ea9e1dbe350002188fb7/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:22 compute-0 ceph-mon[75031]: mon.compute-0 is new leader, mons compute-0 in quorum (ranks 0)
Dec 01 09:12:22 compute-0 ceph-mon[75031]: monmap e1: 1 mons at {compute-0=[v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0]} removed_ranks: {} disallowed_leaders: {}
Dec 01 09:12:22 compute-0 ceph-mon[75031]: fsmap 
Dec 01 09:12:22 compute-0 ceph-mon[75031]: osdmap e1: 0 total, 0 up, 0 in
Dec 01 09:12:22 compute-0 ceph-mon[75031]: mgrmap e1: no daemons active
Dec 01 09:12:22 compute-0 podman[75032]: 2025-12-01 09:12:22.964939073 +0000 UTC m=+0.098699437 container init 8cd0c44f0f4f3aef750be6be0fd2554644b006d303fa2a50cfb7776b5ed09744 (image=quay.io/ceph/ceph:v18, name=musing_haslett, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef)
Dec 01 09:12:22 compute-0 podman[75032]: 2025-12-01 09:12:22.970039608 +0000 UTC m=+0.103799972 container start 8cd0c44f0f4f3aef750be6be0fd2554644b006d303fa2a50cfb7776b5ed09744 (image=quay.io/ceph/ceph:v18, name=musing_haslett, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:12:22 compute-0 podman[75032]: 2025-12-01 09:12:22.973117216 +0000 UTC m=+0.106877610 container attach 8cd0c44f0f4f3aef750be6be0fd2554644b006d303fa2a50cfb7776b5ed09744 (image=quay.io/ceph/ceph:v18, name=musing_haslett, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec 01 09:12:22 compute-0 podman[75032]: 2025-12-01 09:12:22.889254611 +0000 UTC m=+0.023014975 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:12:23 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config set, name=public_network}] v 0) v1
Dec 01 09:12:23 compute-0 systemd[1]: libpod-8cd0c44f0f4f3aef750be6be0fd2554644b006d303fa2a50cfb7776b5ed09744.scope: Deactivated successfully.
Dec 01 09:12:23 compute-0 conmon[75086]: conmon 8cd0c44f0f4f3aef750b <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-8cd0c44f0f4f3aef750be6be0fd2554644b006d303fa2a50cfb7776b5ed09744.scope/container/memory.events
Dec 01 09:12:23 compute-0 podman[75032]: 2025-12-01 09:12:23.390853613 +0000 UTC m=+0.524613987 container died 8cd0c44f0f4f3aef750be6be0fd2554644b006d303fa2a50cfb7776b5ed09744 (image=quay.io/ceph/ceph:v18, name=musing_haslett, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS)
Dec 01 09:12:23 compute-0 systemd[1]: var-lib-containers-storage-overlay-4f150891d01ab69b45cabda0c8dc4f9353fd69c804f4ea9e1dbe350002188fb7-merged.mount: Deactivated successfully.
Dec 01 09:12:23 compute-0 podman[75032]: 2025-12-01 09:12:23.478761973 +0000 UTC m=+0.612522337 container remove 8cd0c44f0f4f3aef750be6be0fd2554644b006d303fa2a50cfb7776b5ed09744 (image=quay.io/ceph/ceph:v18, name=musing_haslett, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default)
Dec 01 09:12:23 compute-0 systemd[1]: libpod-conmon-8cd0c44f0f4f3aef750be6be0fd2554644b006d303fa2a50cfb7776b5ed09744.scope: Deactivated successfully.
Dec 01 09:12:23 compute-0 podman[75124]: 2025-12-01 09:12:23.531702198 +0000 UTC m=+0.036581961 container create a4dc3855eb7b1429c918e310689980e516e0e7220d4a3dfe8022321d7faaf5a8 (image=quay.io/ceph/ceph:v18, name=beautiful_pare, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=reef)
Dec 01 09:12:23 compute-0 systemd[1]: Started libpod-conmon-a4dc3855eb7b1429c918e310689980e516e0e7220d4a3dfe8022321d7faaf5a8.scope.
Dec 01 09:12:23 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:12:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca66f84caa9593e9a3ba3d861c10f5d0b5cec7885cd1206298467e83650d6127/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca66f84caa9593e9a3ba3d861c10f5d0b5cec7885cd1206298467e83650d6127/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca66f84caa9593e9a3ba3d861c10f5d0b5cec7885cd1206298467e83650d6127/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:23 compute-0 podman[75124]: 2025-12-01 09:12:23.594898815 +0000 UTC m=+0.099778638 container init a4dc3855eb7b1429c918e310689980e516e0e7220d4a3dfe8022321d7faaf5a8 (image=quay.io/ceph/ceph:v18, name=beautiful_pare, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Dec 01 09:12:23 compute-0 podman[75124]: 2025-12-01 09:12:23.601010409 +0000 UTC m=+0.105890162 container start a4dc3855eb7b1429c918e310689980e516e0e7220d4a3dfe8022321d7faaf5a8 (image=quay.io/ceph/ceph:v18, name=beautiful_pare, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:12:23 compute-0 podman[75124]: 2025-12-01 09:12:23.604125277 +0000 UTC m=+0.109005050 container attach a4dc3855eb7b1429c918e310689980e516e0e7220d4a3dfe8022321d7faaf5a8 (image=quay.io/ceph/ceph:v18, name=beautiful_pare, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:12:23 compute-0 podman[75124]: 2025-12-01 09:12:23.51630547 +0000 UTC m=+0.021185243 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:12:23 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config set, name=cluster_network}] v 0) v1
Dec 01 09:12:24 compute-0 systemd[1]: libpod-a4dc3855eb7b1429c918e310689980e516e0e7220d4a3dfe8022321d7faaf5a8.scope: Deactivated successfully.
Dec 01 09:12:24 compute-0 podman[75124]: 2025-12-01 09:12:24.005195991 +0000 UTC m=+0.510075744 container died a4dc3855eb7b1429c918e310689980e516e0e7220d4a3dfe8022321d7faaf5a8 (image=quay.io/ceph/ceph:v18, name=beautiful_pare, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:12:24 compute-0 systemd[1]: var-lib-containers-storage-overlay-ca66f84caa9593e9a3ba3d861c10f5d0b5cec7885cd1206298467e83650d6127-merged.mount: Deactivated successfully.
Dec 01 09:12:24 compute-0 podman[75124]: 2025-12-01 09:12:24.212196907 +0000 UTC m=+0.717076660 container remove a4dc3855eb7b1429c918e310689980e516e0e7220d4a3dfe8022321d7faaf5a8 (image=quay.io/ceph/ceph:v18, name=beautiful_pare, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507)
Dec 01 09:12:24 compute-0 systemd[1]: libpod-conmon-a4dc3855eb7b1429c918e310689980e516e0e7220d4a3dfe8022321d7faaf5a8.scope: Deactivated successfully.
Dec 01 09:12:24 compute-0 systemd[1]: Reloading.
Dec 01 09:12:24 compute-0 systemd-sysv-generator[75210]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:12:24 compute-0 systemd-rc-local-generator[75205]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:12:24 compute-0 systemd[1]: Reloading.
Dec 01 09:12:24 compute-0 systemd-sysv-generator[75250]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:12:24 compute-0 systemd-rc-local-generator[75246]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:12:24 compute-0 systemd[1]: Starting Ceph mgr.compute-0.psduho for 5620a9fb-e540-5250-a0e8-7aaad5347e3b...
Dec 01 09:12:24 compute-0 podman[75304]: 2025-12-01 09:12:24.987027417 +0000 UTC m=+0.042793758 container create d04e39f9595930c329593b0175fd79c3eaca1ee860ab21f8e9c224dfe8abed9b (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507)
Dec 01 09:12:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77ef07e55d6a56d24c49c75a663dce65c3f6b4cb048e39f91ec5bc9931a540e4/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77ef07e55d6a56d24c49c75a663dce65c3f6b4cb048e39f91ec5bc9931a540e4/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77ef07e55d6a56d24c49c75a663dce65c3f6b4cb048e39f91ec5bc9931a540e4/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77ef07e55d6a56d24c49c75a663dce65c3f6b4cb048e39f91ec5bc9931a540e4/merged/var/lib/ceph/mgr/ceph-compute-0.psduho supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:25 compute-0 podman[75304]: 2025-12-01 09:12:25.038264794 +0000 UTC m=+0.094031155 container init d04e39f9595930c329593b0175fd79c3eaca1ee860ab21f8e9c224dfe8abed9b (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:12:25 compute-0 podman[75304]: 2025-12-01 09:12:25.042774642 +0000 UTC m=+0.098540983 container start d04e39f9595930c329593b0175fd79c3eaca1ee860ab21f8e9c224dfe8abed9b (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Dec 01 09:12:25 compute-0 bash[75304]: d04e39f9595930c329593b0175fd79c3eaca1ee860ab21f8e9c224dfe8abed9b
Dec 01 09:12:25 compute-0 podman[75304]: 2025-12-01 09:12:24.968465559 +0000 UTC m=+0.024231920 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:12:25 compute-0 systemd[1]: Started Ceph mgr.compute-0.psduho for 5620a9fb-e540-5250-a0e8-7aaad5347e3b.
Dec 01 09:12:25 compute-0 ceph-mgr[75324]: set uid:gid to 167:167 (ceph:ceph)
Dec 01 09:12:25 compute-0 ceph-mgr[75324]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mgr, pid 2
Dec 01 09:12:25 compute-0 ceph-mgr[75324]: pidfile_write: ignore empty --pid-file
Dec 01 09:12:25 compute-0 podman[75325]: 2025-12-01 09:12:25.122218411 +0000 UTC m=+0.042566432 container create 123b93bc482e5e5f84b9bea29975ef6ea3bcafc0428270c6f560045022074559 (image=quay.io/ceph/ceph:v18, name=festive_sutherland, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:12:25 compute-0 systemd[1]: Started libpod-conmon-123b93bc482e5e5f84b9bea29975ef6ea3bcafc0428270c6f560045022074559.scope.
Dec 01 09:12:25 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'alerts'
Dec 01 09:12:25 compute-0 podman[75325]: 2025-12-01 09:12:25.103726165 +0000 UTC m=+0.024074196 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:12:25 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:12:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/338005094f1dbd35386ef3f1269c25328b228dbec05459ea31aa5f86da4f58c0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/338005094f1dbd35386ef3f1269c25328b228dbec05459ea31aa5f86da4f58c0/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/338005094f1dbd35386ef3f1269c25328b228dbec05459ea31aa5f86da4f58c0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:25 compute-0 podman[75325]: 2025-12-01 09:12:25.223941243 +0000 UTC m=+0.144289254 container init 123b93bc482e5e5f84b9bea29975ef6ea3bcafc0428270c6f560045022074559 (image=quay.io/ceph/ceph:v18, name=festive_sutherland, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, CEPH_REF=reef, OSD_FLAVOR=default)
Dec 01 09:12:25 compute-0 podman[75325]: 2025-12-01 09:12:25.230617753 +0000 UTC m=+0.150965764 container start 123b93bc482e5e5f84b9bea29975ef6ea3bcafc0428270c6f560045022074559 (image=quay.io/ceph/ceph:v18, name=festive_sutherland, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:12:25 compute-0 podman[75325]: 2025-12-01 09:12:25.248713597 +0000 UTC m=+0.169061618 container attach 123b93bc482e5e5f84b9bea29975ef6ea3bcafc0428270c6f560045022074559 (image=quay.io/ceph/ceph:v18, name=festive_sutherland, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3)
Dec 01 09:12:25 compute-0 ceph-mgr[75324]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 01 09:12:25 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'balancer'
Dec 01 09:12:25 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:25.528+0000 7f54427be140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 01 09:12:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1
Dec 01 09:12:25 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1543184186' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec 01 09:12:25 compute-0 festive_sutherland[75366]: 
Dec 01 09:12:25 compute-0 festive_sutherland[75366]: {
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:     "fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:     "health": {
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:         "status": "HEALTH_OK",
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:         "checks": {},
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:         "mutes": []
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:     },
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:     "election_epoch": 5,
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:     "quorum": [
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:         0
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:     ],
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:     "quorum_names": [
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:         "compute-0"
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:     ],
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:     "quorum_age": 2,
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:     "monmap": {
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:         "epoch": 1,
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:         "min_mon_release_name": "reef",
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:         "num_mons": 1
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:     },
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:     "osdmap": {
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:         "epoch": 1,
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:         "num_osds": 0,
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:         "num_up_osds": 0,
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:         "osd_up_since": 0,
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:         "num_in_osds": 0,
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:         "osd_in_since": 0,
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:         "num_remapped_pgs": 0
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:     },
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:     "pgmap": {
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:         "pgs_by_state": [],
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:         "num_pgs": 0,
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:         "num_pools": 0,
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:         "num_objects": 0,
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:         "data_bytes": 0,
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:         "bytes_used": 0,
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:         "bytes_avail": 0,
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:         "bytes_total": 0
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:     },
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:     "fsmap": {
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:         "epoch": 1,
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:         "by_rank": [],
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:         "up:standby": 0
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:     },
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:     "mgrmap": {
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:         "available": false,
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:         "num_standbys": 0,
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:         "modules": [
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:             "iostat",
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:             "nfs",
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:             "restful"
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:         ],
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:         "services": {}
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:     },
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:     "servicemap": {
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:         "epoch": 1,
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:         "modified": "2025-12-01T09:12:20.101670+0000",
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:         "services": {}
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:     },
Dec 01 09:12:25 compute-0 festive_sutherland[75366]:     "progress_events": {}
Dec 01 09:12:25 compute-0 festive_sutherland[75366]: }
Dec 01 09:12:25 compute-0 systemd[1]: libpod-123b93bc482e5e5f84b9bea29975ef6ea3bcafc0428270c6f560045022074559.scope: Deactivated successfully.
Dec 01 09:12:25 compute-0 podman[75325]: 2025-12-01 09:12:25.654447564 +0000 UTC m=+0.574795565 container died 123b93bc482e5e5f84b9bea29975ef6ea3bcafc0428270c6f560045022074559 (image=quay.io/ceph/ceph:v18, name=festive_sutherland, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:12:25 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:25.793+0000 7f54427be140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 01 09:12:25 compute-0 ceph-mgr[75324]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 01 09:12:25 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'cephadm'
Dec 01 09:12:26 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1543184186' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec 01 09:12:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-338005094f1dbd35386ef3f1269c25328b228dbec05459ea31aa5f86da4f58c0-merged.mount: Deactivated successfully.
Dec 01 09:12:26 compute-0 podman[75325]: 2025-12-01 09:12:26.065142121 +0000 UTC m=+0.985490132 container remove 123b93bc482e5e5f84b9bea29975ef6ea3bcafc0428270c6f560045022074559 (image=quay.io/ceph/ceph:v18, name=festive_sutherland, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.license=GPLv2)
Dec 01 09:12:26 compute-0 systemd[1]: libpod-conmon-123b93bc482e5e5f84b9bea29975ef6ea3bcafc0428270c6f560045022074559.scope: Deactivated successfully.
Dec 01 09:12:26 compute-0 sshd-session[75403]: Connection closed by 101.36.224.146 port 33166
Dec 01 09:12:27 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'crash'
Dec 01 09:12:28 compute-0 podman[75416]: 2025-12-01 09:12:28.124541186 +0000 UTC m=+0.037374214 container create 7ac24e9bb8a7eb2f6badced51a33658599a88d785c576bb0035a9b0b8e1e3192 (image=quay.io/ceph/ceph:v18, name=peaceful_tharp, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:12:28 compute-0 systemd[1]: Started libpod-conmon-7ac24e9bb8a7eb2f6badced51a33658599a88d785c576bb0035a9b0b8e1e3192.scope.
Dec 01 09:12:28 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:12:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/26e5b27a33154eac81b93ae6cc3dfac7311b4ee17aa1d5afe6229450499899f4/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/26e5b27a33154eac81b93ae6cc3dfac7311b4ee17aa1d5afe6229450499899f4/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/26e5b27a33154eac81b93ae6cc3dfac7311b4ee17aa1d5afe6229450499899f4/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:28 compute-0 ceph-mgr[75324]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 01 09:12:28 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:28.175+0000 7f54427be140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 01 09:12:28 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'dashboard'
Dec 01 09:12:28 compute-0 podman[75416]: 2025-12-01 09:12:28.190651296 +0000 UTC m=+0.103484324 container init 7ac24e9bb8a7eb2f6badced51a33658599a88d785c576bb0035a9b0b8e1e3192 (image=quay.io/ceph/ceph:v18, name=peaceful_tharp, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507)
Dec 01 09:12:28 compute-0 podman[75416]: 2025-12-01 09:12:28.198492379 +0000 UTC m=+0.111325407 container start 7ac24e9bb8a7eb2f6badced51a33658599a88d785c576bb0035a9b0b8e1e3192 (image=quay.io/ceph/ceph:v18, name=peaceful_tharp, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507)
Dec 01 09:12:28 compute-0 podman[75416]: 2025-12-01 09:12:28.202348678 +0000 UTC m=+0.115181726 container attach 7ac24e9bb8a7eb2f6badced51a33658599a88d785c576bb0035a9b0b8e1e3192 (image=quay.io/ceph/ceph:v18, name=peaceful_tharp, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:12:28 compute-0 podman[75416]: 2025-12-01 09:12:28.107060849 +0000 UTC m=+0.019893897 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:12:28 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1
Dec 01 09:12:28 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4027611761' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]: 
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]: {
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:     "fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:     "health": {
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:         "status": "HEALTH_OK",
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:         "checks": {},
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:         "mutes": []
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:     },
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:     "election_epoch": 5,
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:     "quorum": [
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:         0
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:     ],
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:     "quorum_names": [
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:         "compute-0"
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:     ],
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:     "quorum_age": 5,
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:     "monmap": {
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:         "epoch": 1,
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:         "min_mon_release_name": "reef",
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:         "num_mons": 1
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:     },
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:     "osdmap": {
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:         "epoch": 1,
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:         "num_osds": 0,
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:         "num_up_osds": 0,
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:         "osd_up_since": 0,
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:         "num_in_osds": 0,
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:         "osd_in_since": 0,
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:         "num_remapped_pgs": 0
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:     },
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:     "pgmap": {
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:         "pgs_by_state": [],
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:         "num_pgs": 0,
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:         "num_pools": 0,
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:         "num_objects": 0,
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:         "data_bytes": 0,
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:         "bytes_used": 0,
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:         "bytes_avail": 0,
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:         "bytes_total": 0
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:     },
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:     "fsmap": {
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:         "epoch": 1,
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:         "by_rank": [],
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:         "up:standby": 0
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:     },
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:     "mgrmap": {
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:         "available": false,
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:         "num_standbys": 0,
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:         "modules": [
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:             "iostat",
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:             "nfs",
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:             "restful"
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:         ],
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:         "services": {}
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:     },
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:     "servicemap": {
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:         "epoch": 1,
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:         "modified": "2025-12-01T09:12:20.101670+0000",
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:         "services": {}
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:     },
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]:     "progress_events": {}
Dec 01 09:12:28 compute-0 peaceful_tharp[75432]: }
Dec 01 09:12:28 compute-0 systemd[1]: libpod-7ac24e9bb8a7eb2f6badced51a33658599a88d785c576bb0035a9b0b8e1e3192.scope: Deactivated successfully.
Dec 01 09:12:28 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/4027611761' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec 01 09:12:28 compute-0 podman[75459]: 2025-12-01 09:12:28.635769802 +0000 UTC m=+0.023879700 container died 7ac24e9bb8a7eb2f6badced51a33658599a88d785c576bb0035a9b0b8e1e3192 (image=quay.io/ceph/ceph:v18, name=peaceful_tharp, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec 01 09:12:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-26e5b27a33154eac81b93ae6cc3dfac7311b4ee17aa1d5afe6229450499899f4-merged.mount: Deactivated successfully.
Dec 01 09:12:28 compute-0 podman[75459]: 2025-12-01 09:12:28.732018339 +0000 UTC m=+0.120128217 container remove 7ac24e9bb8a7eb2f6badced51a33658599a88d785c576bb0035a9b0b8e1e3192 (image=quay.io/ceph/ceph:v18, name=peaceful_tharp, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True)
Dec 01 09:12:28 compute-0 systemd[1]: libpod-conmon-7ac24e9bb8a7eb2f6badced51a33658599a88d785c576bb0035a9b0b8e1e3192.scope: Deactivated successfully.
Dec 01 09:12:29 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'devicehealth'
Dec 01 09:12:29 compute-0 ceph-mgr[75324]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 01 09:12:29 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:29.987+0000 7f54427be140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 01 09:12:29 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'diskprediction_local'
Dec 01 09:12:30 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec 01 09:12:30 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec 01 09:12:30 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]:   from numpy import show_config as show_numpy_config
Dec 01 09:12:30 compute-0 ceph-mgr[75324]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 01 09:12:30 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:30.603+0000 7f54427be140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 01 09:12:30 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'influx'
Dec 01 09:12:30 compute-0 podman[75473]: 2025-12-01 09:12:30.849450813 +0000 UTC m=+0.090484374 container create 3f71b260e2ed30e5b4e9c020d8b6813e79a93f5924e63b71745ff1592cdf9c66 (image=quay.io/ceph/ceph:v18, name=gallant_wing, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec 01 09:12:30 compute-0 podman[75473]: 2025-12-01 09:12:30.783749175 +0000 UTC m=+0.024782776 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:12:30 compute-0 ceph-mgr[75324]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 01 09:12:30 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:30.884+0000 7f54427be140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 01 09:12:30 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'insights'
Dec 01 09:12:30 compute-0 systemd[1]: Started libpod-conmon-3f71b260e2ed30e5b4e9c020d8b6813e79a93f5924e63b71745ff1592cdf9c66.scope.
Dec 01 09:12:30 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:12:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6648b0356c9786bed952e3c7ab67904cf4c27f0d2b7c4d6f02f82294e3f23f2e/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6648b0356c9786bed952e3c7ab67904cf4c27f0d2b7c4d6f02f82294e3f23f2e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6648b0356c9786bed952e3c7ab67904cf4c27f0d2b7c4d6f02f82294e3f23f2e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:30 compute-0 podman[75473]: 2025-12-01 09:12:30.936732864 +0000 UTC m=+0.177766425 container init 3f71b260e2ed30e5b4e9c020d8b6813e79a93f5924e63b71745ff1592cdf9c66 (image=quay.io/ceph/ceph:v18, name=gallant_wing, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0)
Dec 01 09:12:30 compute-0 podman[75473]: 2025-12-01 09:12:30.941284604 +0000 UTC m=+0.182318145 container start 3f71b260e2ed30e5b4e9c020d8b6813e79a93f5924e63b71745ff1592cdf9c66 (image=quay.io/ceph/ceph:v18, name=gallant_wing, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:12:30 compute-0 podman[75473]: 2025-12-01 09:12:30.944782573 +0000 UTC m=+0.185816124 container attach 3f71b260e2ed30e5b4e9c020d8b6813e79a93f5924e63b71745ff1592cdf9c66 (image=quay.io/ceph/ceph:v18, name=gallant_wing, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:12:31 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'iostat'
Dec 01 09:12:31 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1
Dec 01 09:12:31 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3742582137' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec 01 09:12:31 compute-0 gallant_wing[75489]: 
Dec 01 09:12:31 compute-0 gallant_wing[75489]: {
Dec 01 09:12:31 compute-0 gallant_wing[75489]:     "fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:12:31 compute-0 gallant_wing[75489]:     "health": {
Dec 01 09:12:31 compute-0 gallant_wing[75489]:         "status": "HEALTH_OK",
Dec 01 09:12:31 compute-0 gallant_wing[75489]:         "checks": {},
Dec 01 09:12:31 compute-0 gallant_wing[75489]:         "mutes": []
Dec 01 09:12:31 compute-0 gallant_wing[75489]:     },
Dec 01 09:12:31 compute-0 gallant_wing[75489]:     "election_epoch": 5,
Dec 01 09:12:31 compute-0 gallant_wing[75489]:     "quorum": [
Dec 01 09:12:31 compute-0 gallant_wing[75489]:         0
Dec 01 09:12:31 compute-0 gallant_wing[75489]:     ],
Dec 01 09:12:31 compute-0 gallant_wing[75489]:     "quorum_names": [
Dec 01 09:12:31 compute-0 gallant_wing[75489]:         "compute-0"
Dec 01 09:12:31 compute-0 gallant_wing[75489]:     ],
Dec 01 09:12:31 compute-0 gallant_wing[75489]:     "quorum_age": 8,
Dec 01 09:12:31 compute-0 gallant_wing[75489]:     "monmap": {
Dec 01 09:12:31 compute-0 gallant_wing[75489]:         "epoch": 1,
Dec 01 09:12:31 compute-0 gallant_wing[75489]:         "min_mon_release_name": "reef",
Dec 01 09:12:31 compute-0 gallant_wing[75489]:         "num_mons": 1
Dec 01 09:12:31 compute-0 gallant_wing[75489]:     },
Dec 01 09:12:31 compute-0 gallant_wing[75489]:     "osdmap": {
Dec 01 09:12:31 compute-0 gallant_wing[75489]:         "epoch": 1,
Dec 01 09:12:31 compute-0 gallant_wing[75489]:         "num_osds": 0,
Dec 01 09:12:31 compute-0 gallant_wing[75489]:         "num_up_osds": 0,
Dec 01 09:12:31 compute-0 gallant_wing[75489]:         "osd_up_since": 0,
Dec 01 09:12:31 compute-0 gallant_wing[75489]:         "num_in_osds": 0,
Dec 01 09:12:31 compute-0 gallant_wing[75489]:         "osd_in_since": 0,
Dec 01 09:12:31 compute-0 gallant_wing[75489]:         "num_remapped_pgs": 0
Dec 01 09:12:31 compute-0 gallant_wing[75489]:     },
Dec 01 09:12:31 compute-0 gallant_wing[75489]:     "pgmap": {
Dec 01 09:12:31 compute-0 gallant_wing[75489]:         "pgs_by_state": [],
Dec 01 09:12:31 compute-0 gallant_wing[75489]:         "num_pgs": 0,
Dec 01 09:12:31 compute-0 gallant_wing[75489]:         "num_pools": 0,
Dec 01 09:12:31 compute-0 gallant_wing[75489]:         "num_objects": 0,
Dec 01 09:12:31 compute-0 gallant_wing[75489]:         "data_bytes": 0,
Dec 01 09:12:31 compute-0 gallant_wing[75489]:         "bytes_used": 0,
Dec 01 09:12:31 compute-0 gallant_wing[75489]:         "bytes_avail": 0,
Dec 01 09:12:31 compute-0 gallant_wing[75489]:         "bytes_total": 0
Dec 01 09:12:31 compute-0 gallant_wing[75489]:     },
Dec 01 09:12:31 compute-0 gallant_wing[75489]:     "fsmap": {
Dec 01 09:12:31 compute-0 gallant_wing[75489]:         "epoch": 1,
Dec 01 09:12:31 compute-0 gallant_wing[75489]:         "by_rank": [],
Dec 01 09:12:31 compute-0 gallant_wing[75489]:         "up:standby": 0
Dec 01 09:12:31 compute-0 gallant_wing[75489]:     },
Dec 01 09:12:31 compute-0 gallant_wing[75489]:     "mgrmap": {
Dec 01 09:12:31 compute-0 gallant_wing[75489]:         "available": false,
Dec 01 09:12:31 compute-0 gallant_wing[75489]:         "num_standbys": 0,
Dec 01 09:12:31 compute-0 gallant_wing[75489]:         "modules": [
Dec 01 09:12:31 compute-0 gallant_wing[75489]:             "iostat",
Dec 01 09:12:31 compute-0 gallant_wing[75489]:             "nfs",
Dec 01 09:12:31 compute-0 gallant_wing[75489]:             "restful"
Dec 01 09:12:31 compute-0 gallant_wing[75489]:         ],
Dec 01 09:12:31 compute-0 gallant_wing[75489]:         "services": {}
Dec 01 09:12:31 compute-0 gallant_wing[75489]:     },
Dec 01 09:12:31 compute-0 gallant_wing[75489]:     "servicemap": {
Dec 01 09:12:31 compute-0 gallant_wing[75489]:         "epoch": 1,
Dec 01 09:12:31 compute-0 gallant_wing[75489]:         "modified": "2025-12-01T09:12:20.101670+0000",
Dec 01 09:12:31 compute-0 gallant_wing[75489]:         "services": {}
Dec 01 09:12:31 compute-0 gallant_wing[75489]:     },
Dec 01 09:12:31 compute-0 gallant_wing[75489]:     "progress_events": {}
Dec 01 09:12:31 compute-0 gallant_wing[75489]: }
Dec 01 09:12:31 compute-0 systemd[1]: libpod-3f71b260e2ed30e5b4e9c020d8b6813e79a93f5924e63b71745ff1592cdf9c66.scope: Deactivated successfully.
Dec 01 09:12:31 compute-0 podman[75473]: 2025-12-01 09:12:31.361751709 +0000 UTC m=+0.602785270 container died 3f71b260e2ed30e5b4e9c020d8b6813e79a93f5924e63b71745ff1592cdf9c66 (image=quay.io/ceph/ceph:v18, name=gallant_wing, OSD_FLAVOR=default, CEPH_REF=reef, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507)
Dec 01 09:12:31 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3742582137' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec 01 09:12:31 compute-0 systemd[1]: var-lib-containers-storage-overlay-6648b0356c9786bed952e3c7ab67904cf4c27f0d2b7c4d6f02f82294e3f23f2e-merged.mount: Deactivated successfully.
Dec 01 09:12:31 compute-0 ceph-mgr[75324]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 01 09:12:31 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'k8sevents'
Dec 01 09:12:31 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:31.413+0000 7f54427be140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 01 09:12:31 compute-0 podman[75473]: 2025-12-01 09:12:31.465852629 +0000 UTC m=+0.706886180 container remove 3f71b260e2ed30e5b4e9c020d8b6813e79a93f5924e63b71745ff1592cdf9c66 (image=quay.io/ceph/ceph:v18, name=gallant_wing, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:12:31 compute-0 systemd[1]: libpod-conmon-3f71b260e2ed30e5b4e9c020d8b6813e79a93f5924e63b71745ff1592cdf9c66.scope: Deactivated successfully.
Dec 01 09:12:33 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'localpool'
Dec 01 09:12:33 compute-0 podman[75527]: 2025-12-01 09:12:33.523058951 +0000 UTC m=+0.036034836 container create 4bb5294eda0c3a4f30a8b95381f1346c825e2938c83b7c37cfabc90350df6b8d (image=quay.io/ceph/ceph:v18, name=vibrant_rosalind, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec 01 09:12:33 compute-0 systemd[1]: Started libpod-conmon-4bb5294eda0c3a4f30a8b95381f1346c825e2938c83b7c37cfabc90350df6b8d.scope.
Dec 01 09:12:33 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:12:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/edd2dbd1264b0a0daca534acc396b9653e0248eda93f1ff39cc7399d9207f0a8/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/edd2dbd1264b0a0daca534acc396b9653e0248eda93f1ff39cc7399d9207f0a8/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/edd2dbd1264b0a0daca534acc396b9653e0248eda93f1ff39cc7399d9207f0a8/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:33 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'mds_autoscaler'
Dec 01 09:12:33 compute-0 podman[75527]: 2025-12-01 09:12:33.507831798 +0000 UTC m=+0.020807703 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:12:33 compute-0 podman[75527]: 2025-12-01 09:12:33.61551737 +0000 UTC m=+0.128493275 container init 4bb5294eda0c3a4f30a8b95381f1346c825e2938c83b7c37cfabc90350df6b8d (image=quay.io/ceph/ceph:v18, name=vibrant_rosalind, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.build-date=20250507, OSD_FLAVOR=default)
Dec 01 09:12:33 compute-0 podman[75527]: 2025-12-01 09:12:33.620250204 +0000 UTC m=+0.133226089 container start 4bb5294eda0c3a4f30a8b95381f1346c825e2938c83b7c37cfabc90350df6b8d (image=quay.io/ceph/ceph:v18, name=vibrant_rosalind, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Dec 01 09:12:33 compute-0 podman[75527]: 2025-12-01 09:12:33.623484156 +0000 UTC m=+0.136460041 container attach 4bb5294eda0c3a4f30a8b95381f1346c825e2938c83b7c37cfabc90350df6b8d (image=quay.io/ceph/ceph:v18, name=vibrant_rosalind, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS)
Dec 01 09:12:34 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1
Dec 01 09:12:34 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/960206289' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]: 
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]: {
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:     "fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:     "health": {
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:         "status": "HEALTH_OK",
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:         "checks": {},
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:         "mutes": []
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:     },
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:     "election_epoch": 5,
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:     "quorum": [
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:         0
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:     ],
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:     "quorum_names": [
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:         "compute-0"
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:     ],
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:     "quorum_age": 11,
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:     "monmap": {
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:         "epoch": 1,
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:         "min_mon_release_name": "reef",
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:         "num_mons": 1
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:     },
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:     "osdmap": {
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:         "epoch": 1,
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:         "num_osds": 0,
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:         "num_up_osds": 0,
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:         "osd_up_since": 0,
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:         "num_in_osds": 0,
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:         "osd_in_since": 0,
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:         "num_remapped_pgs": 0
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:     },
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:     "pgmap": {
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:         "pgs_by_state": [],
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:         "num_pgs": 0,
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:         "num_pools": 0,
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:         "num_objects": 0,
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:         "data_bytes": 0,
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:         "bytes_used": 0,
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:         "bytes_avail": 0,
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:         "bytes_total": 0
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:     },
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:     "fsmap": {
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:         "epoch": 1,
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:         "by_rank": [],
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:         "up:standby": 0
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:     },
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:     "mgrmap": {
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:         "available": false,
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:         "num_standbys": 0,
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:         "modules": [
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:             "iostat",
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:             "nfs",
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:             "restful"
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:         ],
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:         "services": {}
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:     },
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:     "servicemap": {
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:         "epoch": 1,
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:         "modified": "2025-12-01T09:12:20.101670+0000",
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:         "services": {}
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:     },
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]:     "progress_events": {}
Dec 01 09:12:34 compute-0 vibrant_rosalind[75544]: }
Dec 01 09:12:34 compute-0 systemd[1]: libpod-4bb5294eda0c3a4f30a8b95381f1346c825e2938c83b7c37cfabc90350df6b8d.scope: Deactivated successfully.
Dec 01 09:12:34 compute-0 podman[75527]: 2025-12-01 09:12:34.044509598 +0000 UTC m=+0.557485483 container died 4bb5294eda0c3a4f30a8b95381f1346c825e2938c83b7c37cfabc90350df6b8d (image=quay.io/ceph/ceph:v18, name=vibrant_rosalind, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:12:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-edd2dbd1264b0a0daca534acc396b9653e0248eda93f1ff39cc7399d9207f0a8-merged.mount: Deactivated successfully.
Dec 01 09:12:34 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/960206289' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec 01 09:12:34 compute-0 podman[75527]: 2025-12-01 09:12:34.086505462 +0000 UTC m=+0.599481347 container remove 4bb5294eda0c3a4f30a8b95381f1346c825e2938c83b7c37cfabc90350df6b8d (image=quay.io/ceph/ceph:v18, name=vibrant_rosalind, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Dec 01 09:12:34 compute-0 systemd[1]: libpod-conmon-4bb5294eda0c3a4f30a8b95381f1346c825e2938c83b7c37cfabc90350df6b8d.scope: Deactivated successfully.
Dec 01 09:12:34 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'mirroring'
Dec 01 09:12:34 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'nfs'
Dec 01 09:12:35 compute-0 ceph-mgr[75324]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 01 09:12:35 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:35.409+0000 7f54427be140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 01 09:12:35 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'orchestrator'
Dec 01 09:12:36 compute-0 ceph-mgr[75324]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 01 09:12:36 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:36.132+0000 7f54427be140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 01 09:12:36 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'osd_perf_query'
Dec 01 09:12:36 compute-0 podman[75583]: 2025-12-01 09:12:36.149680295 +0000 UTC m=+0.042124059 container create a3ff55b047f6157c6460050a5005133d8fb9b9760825846293fed250481dd170 (image=quay.io/ceph/ceph:v18, name=charming_elbakyan, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default)
Dec 01 09:12:36 compute-0 systemd[1]: Started libpod-conmon-a3ff55b047f6157c6460050a5005133d8fb9b9760825846293fed250481dd170.scope.
Dec 01 09:12:36 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:12:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f336aba3ff6ab595c6e66e85c5dbe7758c9dce32dfb6f9e6990178971a118afe/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f336aba3ff6ab595c6e66e85c5dbe7758c9dce32dfb6f9e6990178971a118afe/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f336aba3ff6ab595c6e66e85c5dbe7758c9dce32dfb6f9e6990178971a118afe/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:36 compute-0 podman[75583]: 2025-12-01 09:12:36.207219121 +0000 UTC m=+0.099662905 container init a3ff55b047f6157c6460050a5005133d8fb9b9760825846293fed250481dd170 (image=quay.io/ceph/ceph:v18, name=charming_elbakyan, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:12:36 compute-0 podman[75583]: 2025-12-01 09:12:36.213046346 +0000 UTC m=+0.105490110 container start a3ff55b047f6157c6460050a5005133d8fb9b9760825846293fed250481dd170 (image=quay.io/ceph/ceph:v18, name=charming_elbakyan, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:12:36 compute-0 podman[75583]: 2025-12-01 09:12:36.216020881 +0000 UTC m=+0.108464695 container attach a3ff55b047f6157c6460050a5005133d8fb9b9760825846293fed250481dd170 (image=quay.io/ceph/ceph:v18, name=charming_elbakyan, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 01 09:12:36 compute-0 podman[75583]: 2025-12-01 09:12:36.130050106 +0000 UTC m=+0.022493870 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:12:36 compute-0 ceph-mgr[75324]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 01 09:12:36 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'osd_support'
Dec 01 09:12:36 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:36.424+0000 7f54427be140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 01 09:12:36 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1
Dec 01 09:12:36 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4191684427' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]: 
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]: {
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:     "fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:     "health": {
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:         "status": "HEALTH_OK",
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:         "checks": {},
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:         "mutes": []
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:     },
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:     "election_epoch": 5,
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:     "quorum": [
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:         0
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:     ],
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:     "quorum_names": [
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:         "compute-0"
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:     ],
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:     "quorum_age": 13,
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:     "monmap": {
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:         "epoch": 1,
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:         "min_mon_release_name": "reef",
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:         "num_mons": 1
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:     },
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:     "osdmap": {
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:         "epoch": 1,
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:         "num_osds": 0,
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:         "num_up_osds": 0,
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:         "osd_up_since": 0,
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:         "num_in_osds": 0,
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:         "osd_in_since": 0,
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:         "num_remapped_pgs": 0
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:     },
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:     "pgmap": {
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:         "pgs_by_state": [],
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:         "num_pgs": 0,
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:         "num_pools": 0,
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:         "num_objects": 0,
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:         "data_bytes": 0,
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:         "bytes_used": 0,
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:         "bytes_avail": 0,
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:         "bytes_total": 0
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:     },
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:     "fsmap": {
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:         "epoch": 1,
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:         "by_rank": [],
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:         "up:standby": 0
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:     },
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:     "mgrmap": {
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:         "available": false,
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:         "num_standbys": 0,
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:         "modules": [
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:             "iostat",
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:             "nfs",
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:             "restful"
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:         ],
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:         "services": {}
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:     },
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:     "servicemap": {
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:         "epoch": 1,
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:         "modified": "2025-12-01T09:12:20.101670+0000",
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:         "services": {}
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:     },
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]:     "progress_events": {}
Dec 01 09:12:36 compute-0 charming_elbakyan[75599]: }
Dec 01 09:12:36 compute-0 systemd[1]: libpod-a3ff55b047f6157c6460050a5005133d8fb9b9760825846293fed250481dd170.scope: Deactivated successfully.
Dec 01 09:12:36 compute-0 podman[75583]: 2025-12-01 09:12:36.623202688 +0000 UTC m=+0.515646462 container died a3ff55b047f6157c6460050a5005133d8fb9b9760825846293fed250481dd170 (image=quay.io/ceph/ceph:v18, name=charming_elbakyan, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True)
Dec 01 09:12:36 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/4191684427' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec 01 09:12:36 compute-0 systemd[1]: var-lib-containers-storage-overlay-f336aba3ff6ab595c6e66e85c5dbe7758c9dce32dfb6f9e6990178971a118afe-merged.mount: Deactivated successfully.
Dec 01 09:12:36 compute-0 ceph-mgr[75324]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 01 09:12:36 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'pg_autoscaler'
Dec 01 09:12:36 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:36.697+0000 7f54427be140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 01 09:12:36 compute-0 podman[75583]: 2025-12-01 09:12:36.713678511 +0000 UTC m=+0.606122285 container remove a3ff55b047f6157c6460050a5005133d8fb9b9760825846293fed250481dd170 (image=quay.io/ceph/ceph:v18, name=charming_elbakyan, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 01 09:12:36 compute-0 systemd[1]: libpod-conmon-a3ff55b047f6157c6460050a5005133d8fb9b9760825846293fed250481dd170.scope: Deactivated successfully.
Dec 01 09:12:37 compute-0 ceph-mgr[75324]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 01 09:12:37 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'progress'
Dec 01 09:12:37 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:37.007+0000 7f54427be140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 01 09:12:37 compute-0 ceph-mgr[75324]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 01 09:12:37 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'prometheus'
Dec 01 09:12:37 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:37.271+0000 7f54427be140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 01 09:12:38 compute-0 ceph-mgr[75324]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 01 09:12:38 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:38.350+0000 7f54427be140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 01 09:12:38 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'rbd_support'
Dec 01 09:12:38 compute-0 ceph-mgr[75324]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 01 09:12:38 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'restful'
Dec 01 09:12:38 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:38.691+0000 7f54427be140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 01 09:12:38 compute-0 podman[75639]: 2025-12-01 09:12:38.81720805 +0000 UTC m=+0.076306300 container create 679353b1335e6229f6ad1b8e55dc5f6d900b3012c3ecaf306200e9ee71968045 (image=quay.io/ceph/ceph:v18, name=ecstatic_wu, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507)
Dec 01 09:12:38 compute-0 systemd[1]: Started libpod-conmon-679353b1335e6229f6ad1b8e55dc5f6d900b3012c3ecaf306200e9ee71968045.scope.
Dec 01 09:12:38 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:12:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f5af3dd0bb1be36ce30127e19f51b77b5caa37cc0fa73d679faa47789ad1c34/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f5af3dd0bb1be36ce30127e19f51b77b5caa37cc0fa73d679faa47789ad1c34/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f5af3dd0bb1be36ce30127e19f51b77b5caa37cc0fa73d679faa47789ad1c34/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:38 compute-0 podman[75639]: 2025-12-01 09:12:38.770987766 +0000 UTC m=+0.030086036 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:12:38 compute-0 podman[75639]: 2025-12-01 09:12:38.882423025 +0000 UTC m=+0.141521295 container init 679353b1335e6229f6ad1b8e55dc5f6d900b3012c3ecaf306200e9ee71968045 (image=quay.io/ceph/ceph:v18, name=ecstatic_wu, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:12:38 compute-0 podman[75639]: 2025-12-01 09:12:38.887479768 +0000 UTC m=+0.146578018 container start 679353b1335e6229f6ad1b8e55dc5f6d900b3012c3ecaf306200e9ee71968045 (image=quay.io/ceph/ceph:v18, name=ecstatic_wu, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=reef)
Dec 01 09:12:38 compute-0 podman[75639]: 2025-12-01 09:12:38.890664759 +0000 UTC m=+0.149763029 container attach 679353b1335e6229f6ad1b8e55dc5f6d900b3012c3ecaf306200e9ee71968045 (image=quay.io/ceph/ceph:v18, name=ecstatic_wu, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:12:39 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1
Dec 01 09:12:39 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2405312232' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]: 
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]: {
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:     "fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:     "health": {
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:         "status": "HEALTH_OK",
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:         "checks": {},
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:         "mutes": []
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:     },
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:     "election_epoch": 5,
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:     "quorum": [
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:         0
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:     ],
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:     "quorum_names": [
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:         "compute-0"
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:     ],
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:     "quorum_age": 16,
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:     "monmap": {
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:         "epoch": 1,
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:         "min_mon_release_name": "reef",
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:         "num_mons": 1
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:     },
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:     "osdmap": {
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:         "epoch": 1,
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:         "num_osds": 0,
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:         "num_up_osds": 0,
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:         "osd_up_since": 0,
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:         "num_in_osds": 0,
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:         "osd_in_since": 0,
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:         "num_remapped_pgs": 0
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:     },
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:     "pgmap": {
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:         "pgs_by_state": [],
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:         "num_pgs": 0,
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:         "num_pools": 0,
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:         "num_objects": 0,
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:         "data_bytes": 0,
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:         "bytes_used": 0,
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:         "bytes_avail": 0,
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:         "bytes_total": 0
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:     },
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:     "fsmap": {
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:         "epoch": 1,
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:         "by_rank": [],
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:         "up:standby": 0
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:     },
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:     "mgrmap": {
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:         "available": false,
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:         "num_standbys": 0,
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:         "modules": [
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:             "iostat",
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:             "nfs",
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:             "restful"
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:         ],
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:         "services": {}
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:     },
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:     "servicemap": {
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:         "epoch": 1,
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:         "modified": "2025-12-01T09:12:20.101670+0000",
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:         "services": {}
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:     },
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]:     "progress_events": {}
Dec 01 09:12:39 compute-0 ecstatic_wu[75655]: }
Dec 01 09:12:39 compute-0 systemd[1]: libpod-679353b1335e6229f6ad1b8e55dc5f6d900b3012c3ecaf306200e9ee71968045.scope: Deactivated successfully.
Dec 01 09:12:39 compute-0 podman[75681]: 2025-12-01 09:12:39.359213901 +0000 UTC m=+0.023289393 container died 679353b1335e6229f6ad1b8e55dc5f6d900b3012c3ecaf306200e9ee71968045 (image=quay.io/ceph/ceph:v18, name=ecstatic_wu, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:12:39 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2405312232' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec 01 09:12:39 compute-0 systemd[1]: var-lib-containers-storage-overlay-1f5af3dd0bb1be36ce30127e19f51b77b5caa37cc0fa73d679faa47789ad1c34-merged.mount: Deactivated successfully.
Dec 01 09:12:39 compute-0 podman[75681]: 2025-12-01 09:12:39.407935097 +0000 UTC m=+0.072010569 container remove 679353b1335e6229f6ad1b8e55dc5f6d900b3012c3ecaf306200e9ee71968045 (image=quay.io/ceph/ceph:v18, name=ecstatic_wu, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2)
Dec 01 09:12:39 compute-0 systemd[1]: libpod-conmon-679353b1335e6229f6ad1b8e55dc5f6d900b3012c3ecaf306200e9ee71968045.scope: Deactivated successfully.
Dec 01 09:12:39 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'rgw'
Dec 01 09:12:40 compute-0 ceph-mgr[75324]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 01 09:12:40 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'rook'
Dec 01 09:12:40 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:40.209+0000 7f54427be140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 01 09:12:41 compute-0 podman[75697]: 2025-12-01 09:12:41.471945112 +0000 UTC m=+0.038529566 container create f80f000b082851730418cb00d7cb87282d56b6b50829272a3b32c3be2565d631 (image=quay.io/ceph/ceph:v18, name=optimistic_ellis, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_REF=reef, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.license=GPLv2)
Dec 01 09:12:41 compute-0 systemd[1]: Started libpod-conmon-f80f000b082851730418cb00d7cb87282d56b6b50829272a3b32c3be2565d631.scope.
Dec 01 09:12:41 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:12:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ba195f68fc936af12720c541d59f1025ce9c83c7321599a69c053913c2b9bc0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ba195f68fc936af12720c541d59f1025ce9c83c7321599a69c053913c2b9bc0/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ba195f68fc936af12720c541d59f1025ce9c83c7321599a69c053913c2b9bc0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:41 compute-0 podman[75697]: 2025-12-01 09:12:41.455706331 +0000 UTC m=+0.022290815 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:12:41 compute-0 podman[75697]: 2025-12-01 09:12:41.562085505 +0000 UTC m=+0.128669999 container init f80f000b082851730418cb00d7cb87282d56b6b50829272a3b32c3be2565d631 (image=quay.io/ceph/ceph:v18, name=optimistic_ellis, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=reef, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:12:41 compute-0 podman[75697]: 2025-12-01 09:12:41.567699565 +0000 UTC m=+0.134284029 container start f80f000b082851730418cb00d7cb87282d56b6b50829272a3b32c3be2565d631 (image=quay.io/ceph/ceph:v18, name=optimistic_ellis, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3)
Dec 01 09:12:41 compute-0 podman[75697]: 2025-12-01 09:12:41.570998549 +0000 UTC m=+0.137583053 container attach f80f000b082851730418cb00d7cb87282d56b6b50829272a3b32c3be2565d631 (image=quay.io/ceph/ceph:v18, name=optimistic_ellis, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec 01 09:12:41 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1
Dec 01 09:12:41 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1192011265' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]: 
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]: {
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:     "fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:     "health": {
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:         "status": "HEALTH_OK",
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:         "checks": {},
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:         "mutes": []
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:     },
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:     "election_epoch": 5,
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:     "quorum": [
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:         0
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:     ],
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:     "quorum_names": [
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:         "compute-0"
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:     ],
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:     "quorum_age": 19,
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:     "monmap": {
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:         "epoch": 1,
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:         "min_mon_release_name": "reef",
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:         "num_mons": 1
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:     },
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:     "osdmap": {
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:         "epoch": 1,
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:         "num_osds": 0,
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:         "num_up_osds": 0,
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:         "osd_up_since": 0,
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:         "num_in_osds": 0,
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:         "osd_in_since": 0,
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:         "num_remapped_pgs": 0
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:     },
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:     "pgmap": {
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:         "pgs_by_state": [],
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:         "num_pgs": 0,
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:         "num_pools": 0,
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:         "num_objects": 0,
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:         "data_bytes": 0,
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:         "bytes_used": 0,
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:         "bytes_avail": 0,
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:         "bytes_total": 0
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:     },
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:     "fsmap": {
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:         "epoch": 1,
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:         "by_rank": [],
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:         "up:standby": 0
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:     },
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:     "mgrmap": {
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:         "available": false,
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:         "num_standbys": 0,
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:         "modules": [
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:             "iostat",
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:             "nfs",
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:             "restful"
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:         ],
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:         "services": {}
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:     },
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:     "servicemap": {
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:         "epoch": 1,
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:         "modified": "2025-12-01T09:12:20.101670+0000",
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:         "services": {}
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:     },
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]:     "progress_events": {}
Dec 01 09:12:41 compute-0 optimistic_ellis[75713]: }
Dec 01 09:12:41 compute-0 systemd[1]: libpod-f80f000b082851730418cb00d7cb87282d56b6b50829272a3b32c3be2565d631.scope: Deactivated successfully.
Dec 01 09:12:41 compute-0 podman[75697]: 2025-12-01 09:12:41.989011044 +0000 UTC m=+0.555595508 container died f80f000b082851730418cb00d7cb87282d56b6b50829272a3b32c3be2565d631 (image=quay.io/ceph/ceph:v18, name=optimistic_ellis, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:12:42 compute-0 systemd[1]: var-lib-containers-storage-overlay-9ba195f68fc936af12720c541d59f1025ce9c83c7321599a69c053913c2b9bc0-merged.mount: Deactivated successfully.
Dec 01 09:12:42 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1192011265' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec 01 09:12:42 compute-0 podman[75697]: 2025-12-01 09:12:42.029267959 +0000 UTC m=+0.595852423 container remove f80f000b082851730418cb00d7cb87282d56b6b50829272a3b32c3be2565d631 (image=quay.io/ceph/ceph:v18, name=optimistic_ellis, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:12:42 compute-0 systemd[1]: libpod-conmon-f80f000b082851730418cb00d7cb87282d56b6b50829272a3b32c3be2565d631.scope: Deactivated successfully.
Dec 01 09:12:42 compute-0 ceph-mgr[75324]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 01 09:12:42 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'selftest'
Dec 01 09:12:42 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:42.480+0000 7f54427be140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 01 09:12:42 compute-0 ceph-mgr[75324]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 01 09:12:42 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'snap_schedule'
Dec 01 09:12:42 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:42.745+0000 7f54427be140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 01 09:12:43 compute-0 ceph-mgr[75324]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec 01 09:12:43 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'stats'
Dec 01 09:12:43 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:43.017+0000 7f54427be140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec 01 09:12:43 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'status'
Dec 01 09:12:43 compute-0 ceph-mgr[75324]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec 01 09:12:43 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'telegraf'
Dec 01 09:12:43 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:43.585+0000 7f54427be140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec 01 09:12:43 compute-0 ceph-mgr[75324]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 01 09:12:43 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'telemetry'
Dec 01 09:12:43 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:43.835+0000 7f54427be140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 01 09:12:44 compute-0 podman[75755]: 2025-12-01 09:12:44.072610428 +0000 UTC m=+0.021644757 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:12:44 compute-0 ceph-mgr[75324]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 01 09:12:44 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'test_orchestrator'
Dec 01 09:12:44 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:44.523+0000 7f54427be140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 01 09:12:45 compute-0 ceph-mgr[75324]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 01 09:12:45 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:45.269+0000 7f54427be140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 01 09:12:45 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'volumes'
Dec 01 09:12:45 compute-0 podman[75755]: 2025-12-01 09:12:45.706565766 +0000 UTC m=+1.655600065 container create 7db8ae583503c52c3ccd546573f9a8f3e0e8dc53507bf3f13554b401770c8ff4 (image=quay.io/ceph/ceph:v18, name=jovial_lamport, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:12:45 compute-0 systemd[1]: Started libpod-conmon-7db8ae583503c52c3ccd546573f9a8f3e0e8dc53507bf3f13554b401770c8ff4.scope.
Dec 01 09:12:45 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:12:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c006491637e3f311018b7732dd558850085fa1f9a3c32149b01a12087fce55d/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c006491637e3f311018b7732dd558850085fa1f9a3c32149b01a12087fce55d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c006491637e3f311018b7732dd558850085fa1f9a3c32149b01a12087fce55d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:45 compute-0 podman[75755]: 2025-12-01 09:12:45.830377476 +0000 UTC m=+1.779411795 container init 7db8ae583503c52c3ccd546573f9a8f3e0e8dc53507bf3f13554b401770c8ff4 (image=quay.io/ceph/ceph:v18, name=jovial_lamport, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 01 09:12:45 compute-0 podman[75755]: 2025-12-01 09:12:45.83756796 +0000 UTC m=+1.786602259 container start 7db8ae583503c52c3ccd546573f9a8f3e0e8dc53507bf3f13554b401770c8ff4 (image=quay.io/ceph/ceph:v18, name=jovial_lamport, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3)
Dec 01 09:12:45 compute-0 podman[75755]: 2025-12-01 09:12:45.841021029 +0000 UTC m=+1.790055358 container attach 7db8ae583503c52c3ccd546573f9a8f3e0e8dc53507bf3f13554b401770c8ff4 (image=quay.io/ceph/ceph:v18, name=jovial_lamport, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'zabbix'
Dec 01 09:12:46 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:46.125+0000 7f54427be140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 01 09:12:46 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1
Dec 01 09:12:46 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3735729918' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec 01 09:12:46 compute-0 jovial_lamport[75771]: 
Dec 01 09:12:46 compute-0 jovial_lamport[75771]: {
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:     "fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:     "health": {
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:         "status": "HEALTH_OK",
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:         "checks": {},
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:         "mutes": []
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:     },
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:     "election_epoch": 5,
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:     "quorum": [
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:         0
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:     ],
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:     "quorum_names": [
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:         "compute-0"
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:     ],
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:     "quorum_age": 23,
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:     "monmap": {
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:         "epoch": 1,
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:         "min_mon_release_name": "reef",
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:         "num_mons": 1
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:     },
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:     "osdmap": {
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:         "epoch": 1,
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:         "num_osds": 0,
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:         "num_up_osds": 0,
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:         "osd_up_since": 0,
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:         "num_in_osds": 0,
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:         "osd_in_since": 0,
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:         "num_remapped_pgs": 0
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:     },
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:     "pgmap": {
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:         "pgs_by_state": [],
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:         "num_pgs": 0,
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:         "num_pools": 0,
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:         "num_objects": 0,
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:         "data_bytes": 0,
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:         "bytes_used": 0,
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:         "bytes_avail": 0,
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:         "bytes_total": 0
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:     },
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:     "fsmap": {
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:         "epoch": 1,
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:         "by_rank": [],
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:         "up:standby": 0
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:     },
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:     "mgrmap": {
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:         "available": false,
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:         "num_standbys": 0,
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:         "modules": [
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:             "iostat",
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:             "nfs",
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:             "restful"
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:         ],
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:         "services": {}
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:     },
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:     "servicemap": {
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:         "epoch": 1,
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:         "modified": "2025-12-01T09:12:20.101670+0000",
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:         "services": {}
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:     },
Dec 01 09:12:46 compute-0 jovial_lamport[75771]:     "progress_events": {}
Dec 01 09:12:46 compute-0 jovial_lamport[75771]: }
Dec 01 09:12:46 compute-0 systemd[1]: libpod-7db8ae583503c52c3ccd546573f9a8f3e0e8dc53507bf3f13554b401770c8ff4.scope: Deactivated successfully.
Dec 01 09:12:46 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3735729918' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec 01 09:12:46 compute-0 podman[75797]: 2025-12-01 09:12:46.388262458 +0000 UTC m=+0.053830631 container died 7db8ae583503c52c3ccd546573f9a8f3e0e8dc53507bf3f13554b401770c8ff4 (image=quay.io/ceph/ceph:v18, name=jovial_lamport, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=reef)
Dec 01 09:12:46 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:46.405+0000 7f54427be140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: ms_deliver_dispatch: unhandled message 0x55f9a53f31e0 mon_map magic: 0 v1 from mon.0 v2:192.168.122.100:3300/0
Dec 01 09:12:46 compute-0 systemd[1]: var-lib-containers-storage-overlay-7c006491637e3f311018b7732dd558850085fa1f9a3c32149b01a12087fce55d-merged.mount: Deactivated successfully.
Dec 01 09:12:46 compute-0 ceph-mon[75031]: log_channel(cluster) log [INF] : Activating manager daemon compute-0.psduho
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: mgr handle_mgr_map Activating!
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: mgr handle_mgr_map I am now activating
Dec 01 09:12:46 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : mgrmap e2: compute-0.psduho(active, starting, since 0.00914665s)
Dec 01 09:12:46 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mds metadata"} v 0) v1
Dec 01 09:12:46 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14102 192.168.122.100:0/3463197855' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mds metadata"}]: dispatch
Dec 01 09:12:46 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).mds e1 all = 1
Dec 01 09:12:46 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata"} v 0) v1
Dec 01 09:12:46 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14102 192.168.122.100:0/3463197855' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec 01 09:12:46 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon metadata"} v 0) v1
Dec 01 09:12:46 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14102 192.168.122.100:0/3463197855' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mon metadata"}]: dispatch
Dec 01 09:12:46 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon metadata", "id": "compute-0"} v 0) v1
Dec 01 09:12:46 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14102 192.168.122.100:0/3463197855' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Dec 01 09:12:46 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr metadata", "who": "compute-0.psduho", "id": "compute-0.psduho"} v 0) v1
Dec 01 09:12:46 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14102 192.168.122.100:0/3463197855' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mgr metadata", "who": "compute-0.psduho", "id": "compute-0.psduho"}]: dispatch
Dec 01 09:12:46 compute-0 podman[75797]: 2025-12-01 09:12:46.429070199 +0000 UTC m=+0.094638372 container remove 7db8ae583503c52c3ccd546573f9a8f3e0e8dc53507bf3f13554b401770c8ff4 (image=quay.io/ceph/ceph:v18, name=jovial_lamport, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: [balancer DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: mgr load Constructed class from module: balancer
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: [balancer INFO root] Starting
Dec 01 09:12:46 compute-0 systemd[1]: libpod-conmon-7db8ae583503c52c3ccd546573f9a8f3e0e8dc53507bf3f13554b401770c8ff4.scope: Deactivated successfully.
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: [crash DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: mgr load Constructed class from module: crash
Dec 01 09:12:46 compute-0 ceph-mon[75031]: log_channel(cluster) log [INF] : Manager daemon compute-0.psduho is now available
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: [devicehealth DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: mgr load Constructed class from module: devicehealth
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: [devicehealth INFO root] Starting
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: [iostat DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: mgr load Constructed class from module: iostat
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: [nfs DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: mgr load Constructed class from module: nfs
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: [orchestrator DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: mgr load Constructed class from module: orchestrator
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:12:46
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: [balancer INFO root] No pools available
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: [pg_autoscaler DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: mgr load Constructed class from module: pg_autoscaler
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: [progress DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: mgr load Constructed class from module: progress
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: [progress INFO root] Loading...
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: [progress INFO root] No stored events to load
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: [progress INFO root] Loaded [] historic events
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: [progress INFO root] Loaded OSDMap, ready.
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: [rbd_support DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: [rbd_support INFO root] recovery thread starting
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: [rbd_support INFO root] starting setup
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: mgr load Constructed class from module: rbd_support
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: [restful DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: mgr load Constructed class from module: restful
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: [restful INFO root] server_addr: :: server_port: 8003
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: [restful WARNING root] server not running: no certificate configured
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: [status DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: mgr load Constructed class from module: status
Dec 01 09:12:46 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.psduho/mirror_snapshot_schedule"} v 0) v1
Dec 01 09:12:46 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14102 192.168.122.100:0/3463197855' entity='mgr.compute-0.psduho' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.psduho/mirror_snapshot_schedule"}]: dispatch
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: [telemetry DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: mgr load Constructed class from module: telemetry
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: [volumes DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: starting
Dec 01 09:12:46 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/telemetry/report_id}] v 0) v1
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: [rbd_support INFO root] PerfHandler: starting
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TaskHandler: starting
Dec 01 09:12:46 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.psduho/trash_purge_schedule"} v 0) v1
Dec 01 09:12:46 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14102 192.168.122.100:0/3463197855' entity='mgr.compute-0.psduho' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.psduho/trash_purge_schedule"}]: dispatch
Dec 01 09:12:46 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14102 192.168.122.100:0/3463197855' entity='mgr.compute-0.psduho' 
Dec 01 09:12:46 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/telemetry/salt}] v 0) v1
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: starting
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: [rbd_support INFO root] setup complete
Dec 01 09:12:46 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14102 192.168.122.100:0/3463197855' entity='mgr.compute-0.psduho' 
Dec 01 09:12:46 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/telemetry/collection}] v 0) v1
Dec 01 09:12:46 compute-0 ceph-mgr[75324]: mgr load Constructed class from module: volumes
Dec 01 09:12:46 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14102 192.168.122.100:0/3463197855' entity='mgr.compute-0.psduho' 
Dec 01 09:12:47 compute-0 ceph-mon[75031]: Activating manager daemon compute-0.psduho
Dec 01 09:12:47 compute-0 ceph-mon[75031]: mgrmap e2: compute-0.psduho(active, starting, since 0.00914665s)
Dec 01 09:12:47 compute-0 ceph-mon[75031]: from='mgr.14102 192.168.122.100:0/3463197855' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mds metadata"}]: dispatch
Dec 01 09:12:47 compute-0 ceph-mon[75031]: from='mgr.14102 192.168.122.100:0/3463197855' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec 01 09:12:47 compute-0 ceph-mon[75031]: from='mgr.14102 192.168.122.100:0/3463197855' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mon metadata"}]: dispatch
Dec 01 09:12:47 compute-0 ceph-mon[75031]: from='mgr.14102 192.168.122.100:0/3463197855' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Dec 01 09:12:47 compute-0 ceph-mon[75031]: from='mgr.14102 192.168.122.100:0/3463197855' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mgr metadata", "who": "compute-0.psduho", "id": "compute-0.psduho"}]: dispatch
Dec 01 09:12:47 compute-0 ceph-mon[75031]: Manager daemon compute-0.psduho is now available
Dec 01 09:12:47 compute-0 ceph-mon[75031]: from='mgr.14102 192.168.122.100:0/3463197855' entity='mgr.compute-0.psduho' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.psduho/mirror_snapshot_schedule"}]: dispatch
Dec 01 09:12:47 compute-0 ceph-mon[75031]: from='mgr.14102 192.168.122.100:0/3463197855' entity='mgr.compute-0.psduho' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.psduho/trash_purge_schedule"}]: dispatch
Dec 01 09:12:47 compute-0 ceph-mon[75031]: from='mgr.14102 192.168.122.100:0/3463197855' entity='mgr.compute-0.psduho' 
Dec 01 09:12:47 compute-0 ceph-mon[75031]: from='mgr.14102 192.168.122.100:0/3463197855' entity='mgr.compute-0.psduho' 
Dec 01 09:12:47 compute-0 ceph-mon[75031]: from='mgr.14102 192.168.122.100:0/3463197855' entity='mgr.compute-0.psduho' 
Dec 01 09:12:47 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : mgrmap e3: compute-0.psduho(active, since 1.20235s)
Dec 01 09:12:48 compute-0 ceph-mgr[75324]: mgr.server send_report Not sending PG status to monitor yet, waiting for OSDs
Dec 01 09:12:48 compute-0 podman[75892]: 2025-12-01 09:12:48.504668414 +0000 UTC m=+0.047227924 container create ee6b30aad0b0a1746fd2349c4b437804430744e2a05d97c358cf5abf2a4c49d2 (image=quay.io/ceph/ceph:v18, name=exciting_faraday, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef)
Dec 01 09:12:48 compute-0 systemd[1]: Started libpod-conmon-ee6b30aad0b0a1746fd2349c4b437804430744e2a05d97c358cf5abf2a4c49d2.scope.
Dec 01 09:12:48 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:12:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef30f1925195992ef33c598d9f0ccf4d7f97c9de892f94d3d7741733a691b8c6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef30f1925195992ef33c598d9f0ccf4d7f97c9de892f94d3d7741733a691b8c6/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef30f1925195992ef33c598d9f0ccf4d7f97c9de892f94d3d7741733a691b8c6/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:48 compute-0 podman[75892]: 2025-12-01 09:12:48.574004466 +0000 UTC m=+0.116563976 container init ee6b30aad0b0a1746fd2349c4b437804430744e2a05d97c358cf5abf2a4c49d2 (image=quay.io/ceph/ceph:v18, name=exciting_faraday, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, ceph=True, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec 01 09:12:48 compute-0 podman[75892]: 2025-12-01 09:12:48.480786135 +0000 UTC m=+0.023345675 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:12:48 compute-0 podman[75892]: 2025-12-01 09:12:48.584630258 +0000 UTC m=+0.127189788 container start ee6b30aad0b0a1746fd2349c4b437804430744e2a05d97c358cf5abf2a4c49d2 (image=quay.io/ceph/ceph:v18, name=exciting_faraday, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Dec 01 09:12:48 compute-0 podman[75892]: 2025-12-01 09:12:48.588700563 +0000 UTC m=+0.131260113 container attach ee6b30aad0b0a1746fd2349c4b437804430744e2a05d97c358cf5abf2a4c49d2 (image=quay.io/ceph/ceph:v18, name=exciting_faraday, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:12:48 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : mgrmap e4: compute-0.psduho(active, since 2s)
Dec 01 09:12:48 compute-0 ceph-mon[75031]: mgrmap e3: compute-0.psduho(active, since 1.20235s)
Dec 01 09:12:49 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1
Dec 01 09:12:49 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2366778493' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec 01 09:12:49 compute-0 exciting_faraday[75908]: 
Dec 01 09:12:49 compute-0 exciting_faraday[75908]: {
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:     "fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:     "health": {
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:         "status": "HEALTH_OK",
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:         "checks": {},
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:         "mutes": []
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:     },
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:     "election_epoch": 5,
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:     "quorum": [
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:         0
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:     ],
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:     "quorum_names": [
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:         "compute-0"
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:     ],
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:     "quorum_age": 26,
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:     "monmap": {
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:         "epoch": 1,
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:         "min_mon_release_name": "reef",
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:         "num_mons": 1
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:     },
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:     "osdmap": {
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:         "epoch": 1,
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:         "num_osds": 0,
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:         "num_up_osds": 0,
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:         "osd_up_since": 0,
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:         "num_in_osds": 0,
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:         "osd_in_since": 0,
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:         "num_remapped_pgs": 0
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:     },
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:     "pgmap": {
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:         "pgs_by_state": [],
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:         "num_pgs": 0,
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:         "num_pools": 0,
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:         "num_objects": 0,
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:         "data_bytes": 0,
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:         "bytes_used": 0,
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:         "bytes_avail": 0,
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:         "bytes_total": 0
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:     },
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:     "fsmap": {
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:         "epoch": 1,
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:         "by_rank": [],
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:         "up:standby": 0
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:     },
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:     "mgrmap": {
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:         "available": true,
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:         "num_standbys": 0,
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:         "modules": [
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:             "iostat",
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:             "nfs",
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:             "restful"
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:         ],
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:         "services": {}
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:     },
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:     "servicemap": {
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:         "epoch": 1,
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:         "modified": "2025-12-01T09:12:20.101670+0000",
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:         "services": {}
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:     },
Dec 01 09:12:49 compute-0 exciting_faraday[75908]:     "progress_events": {}
Dec 01 09:12:49 compute-0 exciting_faraday[75908]: }
Dec 01 09:12:49 compute-0 systemd[1]: libpod-ee6b30aad0b0a1746fd2349c4b437804430744e2a05d97c358cf5abf2a4c49d2.scope: Deactivated successfully.
Dec 01 09:12:49 compute-0 podman[75892]: 2025-12-01 09:12:49.190537766 +0000 UTC m=+0.733097266 container died ee6b30aad0b0a1746fd2349c4b437804430744e2a05d97c358cf5abf2a4c49d2 (image=quay.io/ceph/ceph:v18, name=exciting_faraday, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Dec 01 09:12:49 compute-0 systemd[1]: var-lib-containers-storage-overlay-ef30f1925195992ef33c598d9f0ccf4d7f97c9de892f94d3d7741733a691b8c6-merged.mount: Deactivated successfully.
Dec 01 09:12:49 compute-0 podman[75892]: 2025-12-01 09:12:49.280896895 +0000 UTC m=+0.823456405 container remove ee6b30aad0b0a1746fd2349c4b437804430744e2a05d97c358cf5abf2a4c49d2 (image=quay.io/ceph/ceph:v18, name=exciting_faraday, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:12:49 compute-0 systemd[1]: libpod-conmon-ee6b30aad0b0a1746fd2349c4b437804430744e2a05d97c358cf5abf2a4c49d2.scope: Deactivated successfully.
Dec 01 09:12:49 compute-0 podman[75946]: 2025-12-01 09:12:49.347632942 +0000 UTC m=+0.041502301 container create 4ce6e0b4f816ff14189e33dcdf71a422b44396341ee6f80b877ecd920ef584d2 (image=quay.io/ceph/ceph:v18, name=crazy_dijkstra, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Dec 01 09:12:49 compute-0 systemd[1]: Started libpod-conmon-4ce6e0b4f816ff14189e33dcdf71a422b44396341ee6f80b877ecd920ef584d2.scope.
Dec 01 09:12:49 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:12:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5a6b58ab70cd89a09667bf796c174534b2436d214e6e38a0d0edb79b9ece471/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5a6b58ab70cd89a09667bf796c174534b2436d214e6e38a0d0edb79b9ece471/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5a6b58ab70cd89a09667bf796c174534b2436d214e6e38a0d0edb79b9ece471/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5a6b58ab70cd89a09667bf796c174534b2436d214e6e38a0d0edb79b9ece471/merged/var/lib/ceph/user.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:49 compute-0 podman[75946]: 2025-12-01 09:12:49.413095774 +0000 UTC m=+0.106965143 container init 4ce6e0b4f816ff14189e33dcdf71a422b44396341ee6f80b877ecd920ef584d2 (image=quay.io/ceph/ceph:v18, name=crazy_dijkstra, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:12:49 compute-0 podman[75946]: 2025-12-01 09:12:49.420233407 +0000 UTC m=+0.114102766 container start 4ce6e0b4f816ff14189e33dcdf71a422b44396341ee6f80b877ecd920ef584d2 (image=quay.io/ceph/ceph:v18, name=crazy_dijkstra, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:12:49 compute-0 podman[75946]: 2025-12-01 09:12:49.42386262 +0000 UTC m=+0.117731979 container attach 4ce6e0b4f816ff14189e33dcdf71a422b44396341ee6f80b877ecd920ef584d2 (image=quay.io/ceph/ceph:v18, name=crazy_dijkstra, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec 01 09:12:49 compute-0 podman[75946]: 2025-12-01 09:12:49.329764224 +0000 UTC m=+0.023633593 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:12:50 compute-0 ceph-mgr[75324]: mgr.server send_report Not sending PG status to monitor yet, waiting for OSDs
Dec 01 09:12:50 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config assimilate-conf"} v 0) v1
Dec 01 09:12:50 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/1715548196' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Dec 01 09:12:50 compute-0 systemd[1]: libpod-4ce6e0b4f816ff14189e33dcdf71a422b44396341ee6f80b877ecd920ef584d2.scope: Deactivated successfully.
Dec 01 09:12:50 compute-0 podman[75946]: 2025-12-01 09:12:50.586633781 +0000 UTC m=+1.280503140 container died 4ce6e0b4f816ff14189e33dcdf71a422b44396341ee6f80b877ecd920ef584d2 (image=quay.io/ceph/ceph:v18, name=crazy_dijkstra, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.schema-version=1.0)
Dec 01 09:12:50 compute-0 ceph-mon[75031]: mgrmap e4: compute-0.psduho(active, since 2s)
Dec 01 09:12:50 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2366778493' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec 01 09:12:50 compute-0 systemd[1]: var-lib-containers-storage-overlay-e5a6b58ab70cd89a09667bf796c174534b2436d214e6e38a0d0edb79b9ece471-merged.mount: Deactivated successfully.
Dec 01 09:12:50 compute-0 podman[75946]: 2025-12-01 09:12:50.907570577 +0000 UTC m=+1.601439936 container remove 4ce6e0b4f816ff14189e33dcdf71a422b44396341ee6f80b877ecd920ef584d2 (image=quay.io/ceph/ceph:v18, name=crazy_dijkstra, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 01 09:12:50 compute-0 systemd[1]: libpod-conmon-4ce6e0b4f816ff14189e33dcdf71a422b44396341ee6f80b877ecd920ef584d2.scope: Deactivated successfully.
Dec 01 09:12:51 compute-0 podman[76006]: 2025-12-01 09:12:50.953871813 +0000 UTC m=+0.025317311 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:12:51 compute-0 podman[76006]: 2025-12-01 09:12:51.072480306 +0000 UTC m=+0.143925784 container create 8b5393917ceccb9fc93f8b97b03eed230140d781dfaf0c421f89ebfbba28d864 (image=quay.io/ceph/ceph:v18, name=practical_pare, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:12:51 compute-0 systemd[1]: Started libpod-conmon-8b5393917ceccb9fc93f8b97b03eed230140d781dfaf0c421f89ebfbba28d864.scope.
Dec 01 09:12:51 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:12:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb09c81a4e4c0f6e89c4aab9bb9e178eb2b1e96581aa46cf1d20bd3dc8d004ca/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb09c81a4e4c0f6e89c4aab9bb9e178eb2b1e96581aa46cf1d20bd3dc8d004ca/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb09c81a4e4c0f6e89c4aab9bb9e178eb2b1e96581aa46cf1d20bd3dc8d004ca/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:51 compute-0 podman[76006]: 2025-12-01 09:12:51.160478188 +0000 UTC m=+0.231923696 container init 8b5393917ceccb9fc93f8b97b03eed230140d781dfaf0c421f89ebfbba28d864 (image=quay.io/ceph/ceph:v18, name=practical_pare, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:12:51 compute-0 podman[76006]: 2025-12-01 09:12:51.172381346 +0000 UTC m=+0.243826824 container start 8b5393917ceccb9fc93f8b97b03eed230140d781dfaf0c421f89ebfbba28d864 (image=quay.io/ceph/ceph:v18, name=practical_pare, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:12:51 compute-0 podman[76006]: 2025-12-01 09:12:51.176213485 +0000 UTC m=+0.247659013 container attach 8b5393917ceccb9fc93f8b97b03eed230140d781dfaf0c421f89ebfbba28d864 (image=quay.io/ceph/ceph:v18, name=practical_pare, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:12:51 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr module enable", "module": "cephadm"} v 0) v1
Dec 01 09:12:51 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/2695057660' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "cephadm"}]: dispatch
Dec 01 09:12:51 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1715548196' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Dec 01 09:12:51 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2695057660' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "cephadm"}]: dispatch
Dec 01 09:12:51 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/2695057660' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "cephadm"}]': finished
Dec 01 09:12:51 compute-0 ceph-mgr[75324]: mgr handle_mgr_map respawning because set of enabled modules changed!
Dec 01 09:12:51 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : mgrmap e5: compute-0.psduho(active, since 5s)
Dec 01 09:12:51 compute-0 systemd[1]: libpod-8b5393917ceccb9fc93f8b97b03eed230140d781dfaf0c421f89ebfbba28d864.scope: Deactivated successfully.
Dec 01 09:12:51 compute-0 podman[76006]: 2025-12-01 09:12:51.84711054 +0000 UTC m=+0.918556008 container died 8b5393917ceccb9fc93f8b97b03eed230140d781dfaf0c421f89ebfbba28d864 (image=quay.io/ceph/ceph:v18, name=practical_pare, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 01 09:12:51 compute-0 systemd[1]: var-lib-containers-storage-overlay-eb09c81a4e4c0f6e89c4aab9bb9e178eb2b1e96581aa46cf1d20bd3dc8d004ca-merged.mount: Deactivated successfully.
Dec 01 09:12:51 compute-0 podman[76006]: 2025-12-01 09:12:51.888734883 +0000 UTC m=+0.960180351 container remove 8b5393917ceccb9fc93f8b97b03eed230140d781dfaf0c421f89ebfbba28d864 (image=quay.io/ceph/ceph:v18, name=practical_pare, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef)
Dec 01 09:12:51 compute-0 systemd[1]: libpod-conmon-8b5393917ceccb9fc93f8b97b03eed230140d781dfaf0c421f89ebfbba28d864.scope: Deactivated successfully.
Dec 01 09:12:51 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: ignoring --setuser ceph since I am not root
Dec 01 09:12:51 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: ignoring --setgroup ceph since I am not root
Dec 01 09:12:51 compute-0 ceph-mgr[75324]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mgr, pid 2
Dec 01 09:12:51 compute-0 ceph-mgr[75324]: pidfile_write: ignore empty --pid-file
Dec 01 09:12:51 compute-0 podman[76060]: 2025-12-01 09:12:51.952811575 +0000 UTC m=+0.043085946 container create 74c346a3d2c16b88400f2fddb0f3b7a162d9890ba292b6b7b259b5876259abda (image=quay.io/ceph/ceph:v18, name=distracted_solomon, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:12:51 compute-0 systemd[1]: Started libpod-conmon-74c346a3d2c16b88400f2fddb0f3b7a162d9890ba292b6b7b259b5876259abda.scope.
Dec 01 09:12:52 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:12:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abc68b8bfee08fe848bc987b6a9c311f5ea87ec70cd30290aa47435cc3b00937/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abc68b8bfee08fe848bc987b6a9c311f5ea87ec70cd30290aa47435cc3b00937/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abc68b8bfee08fe848bc987b6a9c311f5ea87ec70cd30290aa47435cc3b00937/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:52 compute-0 podman[76060]: 2025-12-01 09:12:52.031147313 +0000 UTC m=+0.121421714 container init 74c346a3d2c16b88400f2fddb0f3b7a162d9890ba292b6b7b259b5876259abda (image=quay.io/ceph/ceph:v18, name=distracted_solomon, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:12:52 compute-0 podman[76060]: 2025-12-01 09:12:51.935505373 +0000 UTC m=+0.025779784 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:12:52 compute-0 podman[76060]: 2025-12-01 09:12:52.037416011 +0000 UTC m=+0.127690382 container start 74c346a3d2c16b88400f2fddb0f3b7a162d9890ba292b6b7b259b5876259abda (image=quay.io/ceph/ceph:v18, name=distracted_solomon, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:12:52 compute-0 podman[76060]: 2025-12-01 09:12:52.040693974 +0000 UTC m=+0.130968375 container attach 74c346a3d2c16b88400f2fddb0f3b7a162d9890ba292b6b7b259b5876259abda (image=quay.io/ceph/ceph:v18, name=distracted_solomon, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef)
Dec 01 09:12:52 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'alerts'
Dec 01 09:12:52 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:52.373+0000 7fd338ad4140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 01 09:12:52 compute-0 ceph-mgr[75324]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 01 09:12:52 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'balancer'
Dec 01 09:12:52 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr stat"} v 0) v1
Dec 01 09:12:52 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1686219134' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Dec 01 09:12:52 compute-0 distracted_solomon[76100]: {
Dec 01 09:12:52 compute-0 distracted_solomon[76100]:     "epoch": 5,
Dec 01 09:12:52 compute-0 distracted_solomon[76100]:     "available": true,
Dec 01 09:12:52 compute-0 distracted_solomon[76100]:     "active_name": "compute-0.psduho",
Dec 01 09:12:52 compute-0 distracted_solomon[76100]:     "num_standby": 0
Dec 01 09:12:52 compute-0 distracted_solomon[76100]: }
Dec 01 09:12:52 compute-0 systemd[1]: libpod-74c346a3d2c16b88400f2fddb0f3b7a162d9890ba292b6b7b259b5876259abda.scope: Deactivated successfully.
Dec 01 09:12:52 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:52.646+0000 7fd338ad4140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 01 09:12:52 compute-0 ceph-mgr[75324]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 01 09:12:52 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'cephadm'
Dec 01 09:12:52 compute-0 podman[76126]: 2025-12-01 09:12:52.670646196 +0000 UTC m=+0.026419862 container died 74c346a3d2c16b88400f2fddb0f3b7a162d9890ba292b6b7b259b5876259abda (image=quay.io/ceph/ceph:v18, name=distracted_solomon, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3)
Dec 01 09:12:52 compute-0 systemd[1]: var-lib-containers-storage-overlay-abc68b8bfee08fe848bc987b6a9c311f5ea87ec70cd30290aa47435cc3b00937-merged.mount: Deactivated successfully.
Dec 01 09:12:52 compute-0 podman[76126]: 2025-12-01 09:12:52.706821634 +0000 UTC m=+0.062595280 container remove 74c346a3d2c16b88400f2fddb0f3b7a162d9890ba292b6b7b259b5876259abda (image=quay.io/ceph/ceph:v18, name=distracted_solomon, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Dec 01 09:12:52 compute-0 systemd[1]: libpod-conmon-74c346a3d2c16b88400f2fddb0f3b7a162d9890ba292b6b7b259b5876259abda.scope: Deactivated successfully.
Dec 01 09:12:52 compute-0 podman[76141]: 2025-12-01 09:12:52.754745257 +0000 UTC m=+0.020720400 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:12:52 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2695057660' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "cephadm"}]': finished
Dec 01 09:12:52 compute-0 ceph-mon[75031]: mgrmap e5: compute-0.psduho(active, since 5s)
Dec 01 09:12:52 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1686219134' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Dec 01 09:12:52 compute-0 podman[76141]: 2025-12-01 09:12:52.92332112 +0000 UTC m=+0.189296243 container create 237e1ac54baec5a09fe63b3699c71c6d8c985c0ecb435aaa9d5c22d49ce47ea1 (image=quay.io/ceph/ceph:v18, name=silly_gauss, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:12:52 compute-0 systemd[1]: Started libpod-conmon-237e1ac54baec5a09fe63b3699c71c6d8c985c0ecb435aaa9d5c22d49ce47ea1.scope.
Dec 01 09:12:52 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:12:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/533f47fcdf700e821496873914eb663c7b4db46799c7c8f6833ce92cb64b484c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/533f47fcdf700e821496873914eb663c7b4db46799c7c8f6833ce92cb64b484c/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/533f47fcdf700e821496873914eb663c7b4db46799c7c8f6833ce92cb64b484c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:12:52 compute-0 podman[76141]: 2025-12-01 09:12:52.996164391 +0000 UTC m=+0.262139544 container init 237e1ac54baec5a09fe63b3699c71c6d8c985c0ecb435aaa9d5c22d49ce47ea1 (image=quay.io/ceph/ceph:v18, name=silly_gauss, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:12:53 compute-0 podman[76141]: 2025-12-01 09:12:53.000879095 +0000 UTC m=+0.266854218 container start 237e1ac54baec5a09fe63b3699c71c6d8c985c0ecb435aaa9d5c22d49ce47ea1 (image=quay.io/ceph/ceph:v18, name=silly_gauss, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:12:53 compute-0 podman[76141]: 2025-12-01 09:12:53.004523909 +0000 UTC m=+0.270499032 container attach 237e1ac54baec5a09fe63b3699c71c6d8c985c0ecb435aaa9d5c22d49ce47ea1 (image=quay.io/ceph/ceph:v18, name=silly_gauss, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=reef, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:12:54 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'crash'
Dec 01 09:12:54 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:54.939+0000 7fd338ad4140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 01 09:12:54 compute-0 ceph-mgr[75324]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 01 09:12:54 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'dashboard'
Dec 01 09:12:56 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'devicehealth'
Dec 01 09:12:56 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:56.732+0000 7fd338ad4140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 01 09:12:56 compute-0 ceph-mgr[75324]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 01 09:12:56 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'diskprediction_local'
Dec 01 09:12:57 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec 01 09:12:57 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec 01 09:12:57 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]:   from numpy import show_config as show_numpy_config
Dec 01 09:12:57 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:57.314+0000 7fd338ad4140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 01 09:12:57 compute-0 ceph-mgr[75324]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 01 09:12:57 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'influx'
Dec 01 09:12:57 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:57.595+0000 7fd338ad4140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 01 09:12:57 compute-0 ceph-mgr[75324]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 01 09:12:57 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'insights'
Dec 01 09:12:57 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'iostat'
Dec 01 09:12:58 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:12:58.109+0000 7fd338ad4140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 01 09:12:58 compute-0 ceph-mgr[75324]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 01 09:12:58 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'k8sevents'
Dec 01 09:12:59 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'localpool'
Dec 01 09:13:00 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'mds_autoscaler'
Dec 01 09:13:00 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'mirroring'
Dec 01 09:13:01 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'nfs'
Dec 01 09:13:01 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:01.955+0000 7fd338ad4140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 01 09:13:01 compute-0 ceph-mgr[75324]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 01 09:13:01 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'orchestrator'
Dec 01 09:13:02 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:02.680+0000 7fd338ad4140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 01 09:13:02 compute-0 ceph-mgr[75324]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 01 09:13:02 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'osd_perf_query'
Dec 01 09:13:02 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:02.978+0000 7fd338ad4140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 01 09:13:02 compute-0 ceph-mgr[75324]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 01 09:13:02 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'osd_support'
Dec 01 09:13:03 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:03.251+0000 7fd338ad4140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 01 09:13:03 compute-0 ceph-mgr[75324]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 01 09:13:03 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'pg_autoscaler'
Dec 01 09:13:03 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:03.549+0000 7fd338ad4140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 01 09:13:03 compute-0 ceph-mgr[75324]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 01 09:13:03 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'progress'
Dec 01 09:13:03 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:03.789+0000 7fd338ad4140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 01 09:13:03 compute-0 ceph-mgr[75324]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 01 09:13:03 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'prometheus'
Dec 01 09:13:04 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:04.860+0000 7fd338ad4140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 01 09:13:04 compute-0 ceph-mgr[75324]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 01 09:13:04 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'rbd_support'
Dec 01 09:13:05 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:05.184+0000 7fd338ad4140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 01 09:13:05 compute-0 ceph-mgr[75324]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 01 09:13:05 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'restful'
Dec 01 09:13:05 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'rgw'
Dec 01 09:13:06 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:06.688+0000 7fd338ad4140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 01 09:13:06 compute-0 ceph-mgr[75324]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 01 09:13:06 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'rook'
Dec 01 09:13:09 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:09.058+0000 7fd338ad4140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 01 09:13:09 compute-0 ceph-mgr[75324]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 01 09:13:09 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'selftest'
Dec 01 09:13:09 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:09.321+0000 7fd338ad4140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 01 09:13:09 compute-0 ceph-mgr[75324]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 01 09:13:09 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'snap_schedule'
Dec 01 09:13:09 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:09.594+0000 7fd338ad4140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec 01 09:13:09 compute-0 ceph-mgr[75324]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec 01 09:13:09 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'stats'
Dec 01 09:13:09 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'status'
Dec 01 09:13:10 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:10.173+0000 7fd338ad4140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec 01 09:13:10 compute-0 ceph-mgr[75324]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec 01 09:13:10 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'telegraf'
Dec 01 09:13:10 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:10.431+0000 7fd338ad4140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 01 09:13:10 compute-0 ceph-mgr[75324]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 01 09:13:10 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'telemetry'
Dec 01 09:13:11 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:11.136+0000 7fd338ad4140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 01 09:13:11 compute-0 ceph-mgr[75324]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 01 09:13:11 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'test_orchestrator'
Dec 01 09:13:11 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:11.898+0000 7fd338ad4140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 01 09:13:11 compute-0 ceph-mgr[75324]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 01 09:13:11 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'volumes'
Dec 01 09:13:12 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:12.682+0000 7fd338ad4140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 01 09:13:12 compute-0 ceph-mgr[75324]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 01 09:13:12 compute-0 ceph-mgr[75324]: mgr[py] Loading python module 'zabbix'
Dec 01 09:13:12 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:13:12.923+0000 7fd338ad4140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 01 09:13:12 compute-0 ceph-mgr[75324]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 01 09:13:12 compute-0 ceph-mon[75031]: log_channel(cluster) log [INF] : Active manager daemon compute-0.psduho restarted
Dec 01 09:13:12 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e1 do_prune osdmap full prune enabled
Dec 01 09:13:12 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e1 encode_pending skipping prime_pg_temp; mapping job did not start
Dec 01 09:13:12 compute-0 ceph-mon[75031]: log_channel(cluster) log [INF] : Activating manager daemon compute-0.psduho
Dec 01 09:13:12 compute-0 ceph-mgr[75324]: ms_deliver_dispatch: unhandled message 0x56106644b1e0 mon_map magic: 0 v1 from mon.0 v2:192.168.122.100:3300/0
Dec 01 09:13:12 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e1 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Dec 01 09:13:12 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e1 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Dec 01 09:13:12 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e2 e2: 0 total, 0 up, 0 in
Dec 01 09:13:12 compute-0 ceph-mgr[75324]: mgr handle_mgr_map Activating!
Dec 01 09:13:12 compute-0 ceph-mgr[75324]: mgr handle_mgr_map I am now activating
Dec 01 09:13:12 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e2: 0 total, 0 up, 0 in
Dec 01 09:13:12 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : mgrmap e6: compute-0.psduho(active, starting, since 0.0169613s)
Dec 01 09:13:12 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon metadata", "id": "compute-0"} v 0) v1
Dec 01 09:13:12 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Dec 01 09:13:12 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr metadata", "who": "compute-0.psduho", "id": "compute-0.psduho"} v 0) v1
Dec 01 09:13:12 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mgr metadata", "who": "compute-0.psduho", "id": "compute-0.psduho"}]: dispatch
Dec 01 09:13:12 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mds metadata"} v 0) v1
Dec 01 09:13:12 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mds metadata"}]: dispatch
Dec 01 09:13:12 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).mds e1 all = 1
Dec 01 09:13:12 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata"} v 0) v1
Dec 01 09:13:12 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec 01 09:13:12 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon metadata"} v 0) v1
Dec 01 09:13:12 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mon metadata"}]: dispatch
Dec 01 09:13:12 compute-0 ceph-mgr[75324]: [balancer DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 01 09:13:12 compute-0 ceph-mgr[75324]: mgr load Constructed class from module: balancer
Dec 01 09:13:12 compute-0 ceph-mon[75031]: log_channel(cluster) log [INF] : Manager daemon compute-0.psduho is now available
Dec 01 09:13:12 compute-0 ceph-mgr[75324]: [balancer INFO root] Starting
Dec 01 09:13:12 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:13:12
Dec 01 09:13:12 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:13:12 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:13:12 compute-0 ceph-mgr[75324]: [balancer INFO root] No pools available
Dec 01 09:13:12 compute-0 ceph-mgr[75324]: [cephadm DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 01 09:13:12 compute-0 ceph-mgr[75324]: [cephadm INFO cephadm.migrations] Found migration_current of "None". Setting to last migration.
Dec 01 09:13:12 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Found migration_current of "None". Setting to last migration.
Dec 01 09:13:12 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config set, name=mgr/cephadm/migration_current}] v 0) v1
Dec 01 09:13:12 compute-0 ceph-mon[75031]: Active manager daemon compute-0.psduho restarted
Dec 01 09:13:12 compute-0 ceph-mon[75031]: Activating manager daemon compute-0.psduho
Dec 01 09:13:12 compute-0 ceph-mon[75031]: osdmap e2: 0 total, 0 up, 0 in
Dec 01 09:13:12 compute-0 ceph-mon[75031]: mgrmap e6: compute-0.psduho(active, starting, since 0.0169613s)
Dec 01 09:13:12 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Dec 01 09:13:12 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mgr metadata", "who": "compute-0.psduho", "id": "compute-0.psduho"}]: dispatch
Dec 01 09:13:12 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mds metadata"}]: dispatch
Dec 01 09:13:12 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec 01 09:13:12 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mon metadata"}]: dispatch
Dec 01 09:13:12 compute-0 ceph-mon[75031]: Manager daemon compute-0.psduho is now available
Dec 01 09:13:12 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/config_checks}] v 0) v1
Dec 01 09:13:13 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: mgr load Constructed class from module: cephadm
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: [crash DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: mgr load Constructed class from module: crash
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: [devicehealth DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: mgr load Constructed class from module: devicehealth
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: [devicehealth INFO root] Starting
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: [iostat DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: mgr load Constructed class from module: iostat
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: [nfs DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: mgr load Constructed class from module: nfs
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: [orchestrator DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: mgr load Constructed class from module: orchestrator
Dec 01 09:13:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config dump", "format": "json"} v 0) v1
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: [pg_autoscaler DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: mgr load Constructed class from module: pg_autoscaler
Dec 01 09:13:13 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: [progress DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: mgr load Constructed class from module: progress
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: [progress INFO root] Loading...
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: [progress INFO root] No stored events to load
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: [progress INFO root] Loaded [] historic events
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: [progress INFO root] Loaded OSDMap, ready.
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: [rbd_support DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:13:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config dump", "format": "json"} v 0) v1
Dec 01 09:13:13 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] recovery thread starting
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] starting setup
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: mgr load Constructed class from module: rbd_support
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: [restful DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: mgr load Constructed class from module: restful
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: [status DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: mgr load Constructed class from module: status
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: [restful INFO root] server_addr: :: server_port: 8003
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: [restful WARNING root] server not running: no certificate configured
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: [telemetry DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: mgr load Constructed class from module: telemetry
Dec 01 09:13:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.psduho/mirror_snapshot_schedule"} v 0) v1
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: [volumes DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 01 09:13:13 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.psduho/mirror_snapshot_schedule"}]: dispatch
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: starting
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] PerfHandler: starting
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TaskHandler: starting
Dec 01 09:13:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.psduho/trash_purge_schedule"} v 0) v1
Dec 01 09:13:13 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.psduho/trash_purge_schedule"}]: dispatch
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: starting
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] setup complete
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: mgr load Constructed class from module: volumes
Dec 01 09:13:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/cephadm_agent/root/cert}] v 0) v1
Dec 01 09:13:13 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/cephadm_agent/root/key}] v 0) v1
Dec 01 09:13:13 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:13 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : mgrmap e7: compute-0.psduho(active, since 1.0245s)
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14136 -' entity='client.admin' cmd=[{"prefix": "get_command_descriptions"}]: dispatch
Dec 01 09:13:13 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14136 -' entity='client.admin' cmd=[{"prefix": "mgr_status"}]: dispatch
Dec 01 09:13:13 compute-0 silly_gauss[76158]: {
Dec 01 09:13:13 compute-0 silly_gauss[76158]:     "mgrmap_epoch": 7,
Dec 01 09:13:13 compute-0 silly_gauss[76158]:     "initialized": true
Dec 01 09:13:13 compute-0 silly_gauss[76158]: }
Dec 01 09:13:13 compute-0 systemd[1]: libpod-237e1ac54baec5a09fe63b3699c71c6d8c985c0ecb435aaa9d5c22d49ce47ea1.scope: Deactivated successfully.
Dec 01 09:13:13 compute-0 podman[76141]: 2025-12-01 09:13:13.983281453 +0000 UTC m=+21.249256596 container died 237e1ac54baec5a09fe63b3699c71c6d8c985c0ecb435aaa9d5c22d49ce47ea1 (image=quay.io/ceph/ceph:v18, name=silly_gauss, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0)
Dec 01 09:13:14 compute-0 ceph-mon[75031]: Found migration_current of "None". Setting to last migration.
Dec 01 09:13:14 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:14 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:14 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
Dec 01 09:13:14 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
Dec 01 09:13:14 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.psduho/mirror_snapshot_schedule"}]: dispatch
Dec 01 09:13:14 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.psduho/trash_purge_schedule"}]: dispatch
Dec 01 09:13:14 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:14 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:14 compute-0 ceph-mon[75031]: mgrmap e7: compute-0.psduho(active, since 1.0245s)
Dec 01 09:13:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-533f47fcdf700e821496873914eb663c7b4db46799c7c8f6833ce92cb64b484c-merged.mount: Deactivated successfully.
Dec 01 09:13:14 compute-0 podman[76141]: 2025-12-01 09:13:14.033849153 +0000 UTC m=+21.299824276 container remove 237e1ac54baec5a09fe63b3699c71c6d8c985c0ecb435aaa9d5c22d49ce47ea1 (image=quay.io/ceph/ceph:v18, name=silly_gauss, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec 01 09:13:14 compute-0 systemd[1]: libpod-conmon-237e1ac54baec5a09fe63b3699c71c6d8c985c0ecb435aaa9d5c22d49ce47ea1.scope: Deactivated successfully.
Dec 01 09:13:14 compute-0 podman[76317]: 2025-12-01 09:13:14.103891552 +0000 UTC m=+0.045369677 container create 201eeb12d87ce8c2a1c5dcdfd120eacaab0cde173818034ff85b5c986f40cc0c (image=quay.io/ceph/ceph:v18, name=pedantic_shockley, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, ceph=True, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec 01 09:13:14 compute-0 systemd[1]: Started libpod-conmon-201eeb12d87ce8c2a1c5dcdfd120eacaab0cde173818034ff85b5c986f40cc0c.scope.
Dec 01 09:13:14 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:13:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b31016c66e5295f453bf89da43e12b3696433ba52c0994a26dde030a8f94cfbb/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b31016c66e5295f453bf89da43e12b3696433ba52c0994a26dde030a8f94cfbb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b31016c66e5295f453bf89da43e12b3696433ba52c0994a26dde030a8f94cfbb/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:14 compute-0 podman[76317]: 2025-12-01 09:13:14.168257602 +0000 UTC m=+0.109735727 container init 201eeb12d87ce8c2a1c5dcdfd120eacaab0cde173818034ff85b5c986f40cc0c (image=quay.io/ceph/ceph:v18, name=pedantic_shockley, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:13:14 compute-0 podman[76317]: 2025-12-01 09:13:14.174859698 +0000 UTC m=+0.116337813 container start 201eeb12d87ce8c2a1c5dcdfd120eacaab0cde173818034ff85b5c986f40cc0c (image=quay.io/ceph/ceph:v18, name=pedantic_shockley, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:13:14 compute-0 podman[76317]: 2025-12-01 09:13:14.177714723 +0000 UTC m=+0.119192878 container attach 201eeb12d87ce8c2a1c5dcdfd120eacaab0cde173818034ff85b5c986f40cc0c (image=quay.io/ceph/ceph:v18, name=pedantic_shockley, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:13:14 compute-0 podman[76317]: 2025-12-01 09:13:14.084419244 +0000 UTC m=+0.025897379 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:13:14 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14144 -' entity='client.admin' cmd=[{"prefix": "orch set backend", "module_name": "cephadm", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:13:14 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config set, name=mgr/orchestrator/orchestrator}] v 0) v1
Dec 01 09:13:14 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:14 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config dump", "format": "json"} v 0) v1
Dec 01 09:13:14 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
Dec 01 09:13:14 compute-0 systemd[1]: libpod-201eeb12d87ce8c2a1c5dcdfd120eacaab0cde173818034ff85b5c986f40cc0c.scope: Deactivated successfully.
Dec 01 09:13:14 compute-0 podman[76317]: 2025-12-01 09:13:14.773311848 +0000 UTC m=+0.714789963 container died 201eeb12d87ce8c2a1c5dcdfd120eacaab0cde173818034ff85b5c986f40cc0c (image=quay.io/ceph/ceph:v18, name=pedantic_shockley, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 01 09:13:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-b31016c66e5295f453bf89da43e12b3696433ba52c0994a26dde030a8f94cfbb-merged.mount: Deactivated successfully.
Dec 01 09:13:14 compute-0 podman[76317]: 2025-12-01 09:13:14.814628314 +0000 UTC m=+0.756106429 container remove 201eeb12d87ce8c2a1c5dcdfd120eacaab0cde173818034ff85b5c986f40cc0c (image=quay.io/ceph/ceph:v18, name=pedantic_shockley, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:13:14 compute-0 systemd[1]: libpod-conmon-201eeb12d87ce8c2a1c5dcdfd120eacaab0cde173818034ff85b5c986f40cc0c.scope: Deactivated successfully.
Dec 01 09:13:14 compute-0 podman[76372]: 2025-12-01 09:13:14.873997965 +0000 UTC m=+0.037827873 container create 1ecf27cb202d69d67c54697b7e88065279f0c8fa4349df547038415fc43dcd0d (image=quay.io/ceph/ceph:v18, name=wizardly_goodall, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=reef)
Dec 01 09:13:14 compute-0 ceph-mgr[75324]: [cephadm INFO cherrypy.error] [01/Dec/2025:09:13:14] ENGINE Bus STARTING
Dec 01 09:13:14 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : [01/Dec/2025:09:13:14] ENGINE Bus STARTING
Dec 01 09:13:14 compute-0 systemd[1]: Started libpod-conmon-1ecf27cb202d69d67c54697b7e88065279f0c8fa4349df547038415fc43dcd0d.scope.
Dec 01 09:13:14 compute-0 ceph-mgr[75324]: mgr.server send_report Not sending PG status to monitor yet, waiting for OSDs
Dec 01 09:13:14 compute-0 podman[76372]: 2025-12-01 09:13:14.857478585 +0000 UTC m=+0.021308513 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:13:14 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:13:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4730bdafe340a010fdaa835cc9e1c51938fc458a8e1816a68a54f086582c4e14/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4730bdafe340a010fdaa835cc9e1c51938fc458a8e1816a68a54f086582c4e14/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4730bdafe340a010fdaa835cc9e1c51938fc458a8e1816a68a54f086582c4e14/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:15 compute-0 ceph-mgr[75324]: [cephadm INFO cherrypy.error] [01/Dec/2025:09:13:15] ENGINE Serving on http://192.168.122.100:8765
Dec 01 09:13:15 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : [01/Dec/2025:09:13:15] ENGINE Serving on http://192.168.122.100:8765
Dec 01 09:13:15 compute-0 podman[76372]: 2025-12-01 09:13:15.097472507 +0000 UTC m=+0.261302435 container init 1ecf27cb202d69d67c54697b7e88065279f0c8fa4349df547038415fc43dcd0d (image=quay.io/ceph/ceph:v18, name=wizardly_goodall, ceph=True, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 01 09:13:15 compute-0 podman[76372]: 2025-12-01 09:13:15.102570068 +0000 UTC m=+0.266399976 container start 1ecf27cb202d69d67c54697b7e88065279f0c8fa4349df547038415fc43dcd0d (image=quay.io/ceph/ceph:v18, name=wizardly_goodall, org.label-schema.vendor=CentOS, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Dec 01 09:13:15 compute-0 ceph-mgr[75324]: [cephadm INFO cherrypy.error] [01/Dec/2025:09:13:15] ENGINE Serving on https://192.168.122.100:7150
Dec 01 09:13:15 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : [01/Dec/2025:09:13:15] ENGINE Serving on https://192.168.122.100:7150
Dec 01 09:13:15 compute-0 ceph-mgr[75324]: [cephadm INFO cherrypy.error] [01/Dec/2025:09:13:15] ENGINE Bus STARTED
Dec 01 09:13:15 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : [01/Dec/2025:09:13:15] ENGINE Bus STARTED
Dec 01 09:13:15 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config dump", "format": "json"} v 0) v1
Dec 01 09:13:15 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
Dec 01 09:13:15 compute-0 ceph-mgr[75324]: [cephadm INFO cherrypy.error] [01/Dec/2025:09:13:15] ENGINE Client ('192.168.122.100', 55524) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 01 09:13:15 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : [01/Dec/2025:09:13:15] ENGINE Client ('192.168.122.100', 55524) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 01 09:13:15 compute-0 podman[76372]: 2025-12-01 09:13:15.27278434 +0000 UTC m=+0.436614278 container attach 1ecf27cb202d69d67c54697b7e88065279f0c8fa4349df547038415fc43dcd0d (image=quay.io/ceph/ceph:v18, name=wizardly_goodall, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:13:15 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14146 -' entity='client.admin' cmd=[{"prefix": "cephadm set-user", "user": "ceph-admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:13:15 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/ssh_user}] v 0) v1
Dec 01 09:13:15 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:15 compute-0 ceph-mgr[75324]: [cephadm INFO root] Set ssh ssh_user
Dec 01 09:13:15 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Set ssh ssh_user
Dec 01 09:13:15 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/ssh_config}] v 0) v1
Dec 01 09:13:15 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:15 compute-0 ceph-mgr[75324]: [cephadm INFO root] Set ssh ssh_config
Dec 01 09:13:15 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Set ssh ssh_config
Dec 01 09:13:15 compute-0 ceph-mgr[75324]: [cephadm INFO root] ssh user set to ceph-admin. sudo will be used
Dec 01 09:13:15 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : ssh user set to ceph-admin. sudo will be used
Dec 01 09:13:15 compute-0 wizardly_goodall[76399]: ssh user set to ceph-admin. sudo will be used
Dec 01 09:13:15 compute-0 systemd[1]: libpod-1ecf27cb202d69d67c54697b7e88065279f0c8fa4349df547038415fc43dcd0d.scope: Deactivated successfully.
Dec 01 09:13:15 compute-0 podman[76372]: 2025-12-01 09:13:15.6889674 +0000 UTC m=+0.852797308 container died 1ecf27cb202d69d67c54697b7e88065279f0c8fa4349df547038415fc43dcd0d (image=quay.io/ceph/ceph:v18, name=wizardly_goodall, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:13:15 compute-0 systemd[1]: var-lib-containers-storage-overlay-4730bdafe340a010fdaa835cc9e1c51938fc458a8e1816a68a54f086582c4e14-merged.mount: Deactivated successfully.
Dec 01 09:13:15 compute-0 podman[76372]: 2025-12-01 09:13:15.739521181 +0000 UTC m=+0.903351089 container remove 1ecf27cb202d69d67c54697b7e88065279f0c8fa4349df547038415fc43dcd0d (image=quay.io/ceph/ceph:v18, name=wizardly_goodall, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Dec 01 09:13:15 compute-0 ceph-mon[75031]: from='client.14136 -' entity='client.admin' cmd=[{"prefix": "get_command_descriptions"}]: dispatch
Dec 01 09:13:15 compute-0 ceph-mon[75031]: from='client.14136 -' entity='client.admin' cmd=[{"prefix": "mgr_status"}]: dispatch
Dec 01 09:13:15 compute-0 ceph-mon[75031]: from='client.14144 -' entity='client.admin' cmd=[{"prefix": "orch set backend", "module_name": "cephadm", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:13:15 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:15 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
Dec 01 09:13:15 compute-0 ceph-mon[75031]: [01/Dec/2025:09:13:14] ENGINE Bus STARTING
Dec 01 09:13:15 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
Dec 01 09:13:15 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:15 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:15 compute-0 systemd[1]: libpod-conmon-1ecf27cb202d69d67c54697b7e88065279f0c8fa4349df547038415fc43dcd0d.scope: Deactivated successfully.
Dec 01 09:13:15 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : mgrmap e8: compute-0.psduho(active, since 2s)
Dec 01 09:13:15 compute-0 podman[76448]: 2025-12-01 09:13:15.796918644 +0000 UTC m=+0.034485894 container create a4b7315fd0e37f25e0f03ce5f6f04581e0d9941aeb765dc4e54478825b9b87d8 (image=quay.io/ceph/ceph:v18, name=interesting_chaum, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS)
Dec 01 09:13:15 compute-0 systemd[1]: Started libpod-conmon-a4b7315fd0e37f25e0f03ce5f6f04581e0d9941aeb765dc4e54478825b9b87d8.scope.
Dec 01 09:13:15 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:13:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35674a77d3051546ebbf85c0e5ec1de6b626355bbbc277db15c37d8176afb885/merged/tmp/cephadm-ssh-key supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35674a77d3051546ebbf85c0e5ec1de6b626355bbbc277db15c37d8176afb885/merged/tmp/cephadm-ssh-key.pub supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35674a77d3051546ebbf85c0e5ec1de6b626355bbbc277db15c37d8176afb885/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35674a77d3051546ebbf85c0e5ec1de6b626355bbbc277db15c37d8176afb885/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35674a77d3051546ebbf85c0e5ec1de6b626355bbbc277db15c37d8176afb885/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:15 compute-0 podman[76448]: 2025-12-01 09:13:15.864036436 +0000 UTC m=+0.101603686 container init a4b7315fd0e37f25e0f03ce5f6f04581e0d9941aeb765dc4e54478825b9b87d8 (image=quay.io/ceph/ceph:v18, name=interesting_chaum, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec 01 09:13:15 compute-0 podman[76448]: 2025-12-01 09:13:15.869042174 +0000 UTC m=+0.106609424 container start a4b7315fd0e37f25e0f03ce5f6f04581e0d9941aeb765dc4e54478825b9b87d8 (image=quay.io/ceph/ceph:v18, name=interesting_chaum, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:13:15 compute-0 podman[76448]: 2025-12-01 09:13:15.8719198 +0000 UTC m=+0.109487070 container attach a4b7315fd0e37f25e0f03ce5f6f04581e0d9941aeb765dc4e54478825b9b87d8 (image=quay.io/ceph/ceph:v18, name=interesting_chaum, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_REF=reef, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:13:15 compute-0 podman[76448]: 2025-12-01 09:13:15.782355102 +0000 UTC m=+0.019922362 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:13:16 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14148 -' entity='client.admin' cmd=[{"prefix": "cephadm set-priv-key", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:13:16 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/ssh_identity_key}] v 0) v1
Dec 01 09:13:16 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:16 compute-0 ceph-mgr[75324]: [cephadm INFO root] Set ssh ssh_identity_key
Dec 01 09:13:16 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Set ssh ssh_identity_key
Dec 01 09:13:16 compute-0 ceph-mgr[75324]: [cephadm INFO root] Set ssh private key
Dec 01 09:13:16 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Set ssh private key
Dec 01 09:13:16 compute-0 systemd[1]: libpod-a4b7315fd0e37f25e0f03ce5f6f04581e0d9941aeb765dc4e54478825b9b87d8.scope: Deactivated successfully.
Dec 01 09:13:16 compute-0 podman[76448]: 2025-12-01 09:13:16.498776431 +0000 UTC m=+0.736343681 container died a4b7315fd0e37f25e0f03ce5f6f04581e0d9941aeb765dc4e54478825b9b87d8 (image=quay.io/ceph/ceph:v18, name=interesting_chaum, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2)
Dec 01 09:13:16 compute-0 systemd[1]: var-lib-containers-storage-overlay-35674a77d3051546ebbf85c0e5ec1de6b626355bbbc277db15c37d8176afb885-merged.mount: Deactivated successfully.
Dec 01 09:13:16 compute-0 podman[76448]: 2025-12-01 09:13:16.540414917 +0000 UTC m=+0.777982167 container remove a4b7315fd0e37f25e0f03ce5f6f04581e0d9941aeb765dc4e54478825b9b87d8 (image=quay.io/ceph/ceph:v18, name=interesting_chaum, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Dec 01 09:13:16 compute-0 systemd[1]: libpod-conmon-a4b7315fd0e37f25e0f03ce5f6f04581e0d9941aeb765dc4e54478825b9b87d8.scope: Deactivated successfully.
Dec 01 09:13:16 compute-0 podman[76504]: 2025-12-01 09:13:16.595382908 +0000 UTC m=+0.037223796 container create b0082ac6a6cfe79c40f4c8244adba90a6d76fb9db8254c060e683e89075083c3 (image=quay.io/ceph/ceph:v18, name=friendly_archimedes, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef)
Dec 01 09:13:16 compute-0 systemd[1]: Started libpod-conmon-b0082ac6a6cfe79c40f4c8244adba90a6d76fb9db8254c060e683e89075083c3.scope.
Dec 01 09:13:16 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:13:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a42446b91196bd36de3d655ae987bd74aadf5d1572355a6d885bbde74432781d/merged/tmp/cephadm-ssh-key supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a42446b91196bd36de3d655ae987bd74aadf5d1572355a6d885bbde74432781d/merged/tmp/cephadm-ssh-key.pub supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a42446b91196bd36de3d655ae987bd74aadf5d1572355a6d885bbde74432781d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a42446b91196bd36de3d655ae987bd74aadf5d1572355a6d885bbde74432781d/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a42446b91196bd36de3d655ae987bd74aadf5d1572355a6d885bbde74432781d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:16 compute-0 podman[76504]: 2025-12-01 09:13:16.647898216 +0000 UTC m=+0.089739134 container init b0082ac6a6cfe79c40f4c8244adba90a6d76fb9db8254c060e683e89075083c3 (image=quay.io/ceph/ceph:v18, name=friendly_archimedes, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507)
Dec 01 09:13:16 compute-0 podman[76504]: 2025-12-01 09:13:16.655262815 +0000 UTC m=+0.097103703 container start b0082ac6a6cfe79c40f4c8244adba90a6d76fb9db8254c060e683e89075083c3 (image=quay.io/ceph/ceph:v18, name=friendly_archimedes, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:13:16 compute-0 podman[76504]: 2025-12-01 09:13:16.658193352 +0000 UTC m=+0.100034270 container attach b0082ac6a6cfe79c40f4c8244adba90a6d76fb9db8254c060e683e89075083c3 (image=quay.io/ceph/ceph:v18, name=friendly_archimedes, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_REF=reef)
Dec 01 09:13:16 compute-0 podman[76504]: 2025-12-01 09:13:16.577635091 +0000 UTC m=+0.019475999 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:13:16 compute-0 ceph-mon[75031]: [01/Dec/2025:09:13:15] ENGINE Serving on http://192.168.122.100:8765
Dec 01 09:13:16 compute-0 ceph-mon[75031]: [01/Dec/2025:09:13:15] ENGINE Serving on https://192.168.122.100:7150
Dec 01 09:13:16 compute-0 ceph-mon[75031]: [01/Dec/2025:09:13:15] ENGINE Bus STARTED
Dec 01 09:13:16 compute-0 ceph-mon[75031]: [01/Dec/2025:09:13:15] ENGINE Client ('192.168.122.100', 55524) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 01 09:13:16 compute-0 ceph-mon[75031]: from='client.14146 -' entity='client.admin' cmd=[{"prefix": "cephadm set-user", "user": "ceph-admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:13:16 compute-0 ceph-mon[75031]: Set ssh ssh_user
Dec 01 09:13:16 compute-0 ceph-mon[75031]: Set ssh ssh_config
Dec 01 09:13:16 compute-0 ceph-mon[75031]: ssh user set to ceph-admin. sudo will be used
Dec 01 09:13:16 compute-0 ceph-mon[75031]: mgrmap e8: compute-0.psduho(active, since 2s)
Dec 01 09:13:16 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:16 compute-0 ceph-mgr[75324]: mgr.server send_report Not sending PG status to monitor yet, waiting for OSDs
Dec 01 09:13:17 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14150 -' entity='client.admin' cmd=[{"prefix": "cephadm set-pub-key", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:13:17 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/ssh_identity_pub}] v 0) v1
Dec 01 09:13:17 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:17 compute-0 ceph-mgr[75324]: [cephadm INFO root] Set ssh ssh_identity_pub
Dec 01 09:13:17 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Set ssh ssh_identity_pub
Dec 01 09:13:17 compute-0 systemd[1]: libpod-b0082ac6a6cfe79c40f4c8244adba90a6d76fb9db8254c060e683e89075083c3.scope: Deactivated successfully.
Dec 01 09:13:17 compute-0 conmon[76521]: conmon b0082ac6a6cfe79c40f4 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-b0082ac6a6cfe79c40f4c8244adba90a6d76fb9db8254c060e683e89075083c3.scope/container/memory.events
Dec 01 09:13:17 compute-0 podman[76504]: 2025-12-01 09:13:17.196523047 +0000 UTC m=+0.638363935 container died b0082ac6a6cfe79c40f4c8244adba90a6d76fb9db8254c060e683e89075083c3 (image=quay.io/ceph/ceph:v18, name=friendly_archimedes, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.39.3, org.label-schema.build-date=20250507)
Dec 01 09:13:17 compute-0 systemd[1]: var-lib-containers-storage-overlay-a42446b91196bd36de3d655ae987bd74aadf5d1572355a6d885bbde74432781d-merged.mount: Deactivated successfully.
Dec 01 09:13:17 compute-0 podman[76504]: 2025-12-01 09:13:17.236833953 +0000 UTC m=+0.678674841 container remove b0082ac6a6cfe79c40f4c8244adba90a6d76fb9db8254c060e683e89075083c3 (image=quay.io/ceph/ceph:v18, name=friendly_archimedes, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 01 09:13:17 compute-0 systemd[1]: libpod-conmon-b0082ac6a6cfe79c40f4c8244adba90a6d76fb9db8254c060e683e89075083c3.scope: Deactivated successfully.
Dec 01 09:13:17 compute-0 podman[76558]: 2025-12-01 09:13:17.29331975 +0000 UTC m=+0.038447822 container create 018c8005618a67f05ad9a01548fbb28979b9d2875c234e37cd7bd4015b43acdf (image=quay.io/ceph/ceph:v18, name=eloquent_moore, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:13:17 compute-0 systemd[1]: Started libpod-conmon-018c8005618a67f05ad9a01548fbb28979b9d2875c234e37cd7bd4015b43acdf.scope.
Dec 01 09:13:17 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:13:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c8c0f145ad1053bea93ac7a3845cb01d29d92d1f875b2726294470e9fca2d71/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c8c0f145ad1053bea93ac7a3845cb01d29d92d1f875b2726294470e9fca2d71/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c8c0f145ad1053bea93ac7a3845cb01d29d92d1f875b2726294470e9fca2d71/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:17 compute-0 podman[76558]: 2025-12-01 09:13:17.35163246 +0000 UTC m=+0.096760542 container init 018c8005618a67f05ad9a01548fbb28979b9d2875c234e37cd7bd4015b43acdf (image=quay.io/ceph/ceph:v18, name=eloquent_moore, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec 01 09:13:17 compute-0 podman[76558]: 2025-12-01 09:13:17.356583627 +0000 UTC m=+0.101711689 container start 018c8005618a67f05ad9a01548fbb28979b9d2875c234e37cd7bd4015b43acdf (image=quay.io/ceph/ceph:v18, name=eloquent_moore, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:13:17 compute-0 podman[76558]: 2025-12-01 09:13:17.359624797 +0000 UTC m=+0.104752859 container attach 018c8005618a67f05ad9a01548fbb28979b9d2875c234e37cd7bd4015b43acdf (image=quay.io/ceph/ceph:v18, name=eloquent_moore, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:13:17 compute-0 podman[76558]: 2025-12-01 09:13:17.275601754 +0000 UTC m=+0.020729846 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:13:17 compute-0 ceph-mon[75031]: from='client.14148 -' entity='client.admin' cmd=[{"prefix": "cephadm set-priv-key", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:13:17 compute-0 ceph-mon[75031]: Set ssh ssh_identity_key
Dec 01 09:13:17 compute-0 ceph-mon[75031]: Set ssh private key
Dec 01 09:13:17 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:17 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14152 -' entity='client.admin' cmd=[{"prefix": "cephadm get-pub-key", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:13:17 compute-0 eloquent_moore[76574]: ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDD5sBspn5Xp7CW5nM5RD3iftue/dLFDaYdYMKUOd5ZIzzsN79oQmlltrvDT8HaitCYNC57cVlt7wHN9td+6mcvwmDWLz3o/V+FnCmxsy48GjfcC1QefkBt+998r7HDAx63TkBJ1854CvC2wj92eb6gIWJA7E2cOUv5PCGoqcgonQyMLYSYm4G9uoxpEJeXbBaB94cMGl+dCQbIg8yOceqUlnoZ2+ACyrkBUxbI9JmOBw29M1PMIaYdFW7urAEFJRiovkfYNPVJeMZZNYL4efjsE13flKatlgazaIJqjHCrthfeRq1Hj7qpaJYaubPdhUFuXb2qqTVg/lOwO/R4VJVZDbyOOpncF0p5pv4pZkvb3qMGCM605lN8C8aHi8734oLSBYIDtVMA4HgPLo6nbUtCrzvqfceioWkYNymprvj5Wm/jN1gAtEf9mf4ZPu0uuzHCLbku5lddg770u13ZPqylHCrgjIxnrvb4jygvTBd1myq7uHVdI/518cEH53q0hA0= zuul@controller
Dec 01 09:13:17 compute-0 systemd[1]: libpod-018c8005618a67f05ad9a01548fbb28979b9d2875c234e37cd7bd4015b43acdf.scope: Deactivated successfully.
Dec 01 09:13:17 compute-0 podman[76558]: 2025-12-01 09:13:17.912421382 +0000 UTC m=+0.657549484 container died 018c8005618a67f05ad9a01548fbb28979b9d2875c234e37cd7bd4015b43acdf (image=quay.io/ceph/ceph:v18, name=eloquent_moore, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:13:17 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e2 _set_new_cache_sizes cache_size:1019921889 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:13:18 compute-0 systemd[1]: var-lib-containers-storage-overlay-2c8c0f145ad1053bea93ac7a3845cb01d29d92d1f875b2726294470e9fca2d71-merged.mount: Deactivated successfully.
Dec 01 09:13:18 compute-0 podman[76558]: 2025-12-01 09:13:18.02657624 +0000 UTC m=+0.771704302 container remove 018c8005618a67f05ad9a01548fbb28979b9d2875c234e37cd7bd4015b43acdf (image=quay.io/ceph/ceph:v18, name=eloquent_moore, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Dec 01 09:13:18 compute-0 systemd[1]: libpod-conmon-018c8005618a67f05ad9a01548fbb28979b9d2875c234e37cd7bd4015b43acdf.scope: Deactivated successfully.
Dec 01 09:13:18 compute-0 podman[76612]: 2025-12-01 09:13:18.080284543 +0000 UTC m=+0.035610287 container create 91a57114cca0cf3362a3a88a1b751240f71868452258989b1d2e51e2e85af172 (image=quay.io/ceph/ceph:v18, name=zen_mahavira, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:13:18 compute-0 systemd[1]: Started libpod-conmon-91a57114cca0cf3362a3a88a1b751240f71868452258989b1d2e51e2e85af172.scope.
Dec 01 09:13:18 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:13:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/746a7322c464db3f4e8fd9d40f5c3e04af5c9ab61050d1af559f36551205884d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/746a7322c464db3f4e8fd9d40f5c3e04af5c9ab61050d1af559f36551205884d/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/746a7322c464db3f4e8fd9d40f5c3e04af5c9ab61050d1af559f36551205884d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:18 compute-0 podman[76612]: 2025-12-01 09:13:18.143084647 +0000 UTC m=+0.098410391 container init 91a57114cca0cf3362a3a88a1b751240f71868452258989b1d2e51e2e85af172 (image=quay.io/ceph/ceph:v18, name=zen_mahavira, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Dec 01 09:13:18 compute-0 podman[76612]: 2025-12-01 09:13:18.149662452 +0000 UTC m=+0.104988216 container start 91a57114cca0cf3362a3a88a1b751240f71868452258989b1d2e51e2e85af172 (image=quay.io/ceph/ceph:v18, name=zen_mahavira, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Dec 01 09:13:18 compute-0 podman[76612]: 2025-12-01 09:13:18.153567428 +0000 UTC m=+0.108893272 container attach 91a57114cca0cf3362a3a88a1b751240f71868452258989b1d2e51e2e85af172 (image=quay.io/ceph/ceph:v18, name=zen_mahavira, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_REF=reef, ceph=True, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:13:18 compute-0 podman[76612]: 2025-12-01 09:13:18.065167405 +0000 UTC m=+0.020493169 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:13:18 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14154 -' entity='client.admin' cmd=[{"prefix": "orch host add", "hostname": "compute-0", "addr": "192.168.122.100", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:13:18 compute-0 ceph-mon[75031]: from='client.14150 -' entity='client.admin' cmd=[{"prefix": "cephadm set-pub-key", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:13:18 compute-0 ceph-mon[75031]: Set ssh ssh_identity_pub
Dec 01 09:13:18 compute-0 ceph-mon[75031]: from='client.14152 -' entity='client.admin' cmd=[{"prefix": "cephadm get-pub-key", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:13:18 compute-0 sshd-session[76654]: Accepted publickey for ceph-admin from 192.168.122.100 port 47854 ssh2: RSA SHA256:dOPvQuuIN3kx2Wedm5OOoD6LbPjc9bLwbWo2YXzVB2E
Dec 01 09:13:18 compute-0 systemd[1]: Created slice User Slice of UID 42477.
Dec 01 09:13:18 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42477...
Dec 01 09:13:18 compute-0 systemd-logind[788]: New session 21 of user ceph-admin.
Dec 01 09:13:18 compute-0 ceph-mgr[75324]: mgr.server send_report Not sending PG status to monitor yet, waiting for OSDs
Dec 01 09:13:18 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42477.
Dec 01 09:13:18 compute-0 systemd[1]: Starting User Manager for UID 42477...
Dec 01 09:13:18 compute-0 systemd[76658]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 01 09:13:19 compute-0 sshd-session[76671]: Accepted publickey for ceph-admin from 192.168.122.100 port 47864 ssh2: RSA SHA256:dOPvQuuIN3kx2Wedm5OOoD6LbPjc9bLwbWo2YXzVB2E
Dec 01 09:13:19 compute-0 systemd[76658]: Queued start job for default target Main User Target.
Dec 01 09:13:19 compute-0 systemd-logind[788]: New session 23 of user ceph-admin.
Dec 01 09:13:19 compute-0 systemd[76658]: Created slice User Application Slice.
Dec 01 09:13:19 compute-0 systemd[76658]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 01 09:13:19 compute-0 systemd[76658]: Started Daily Cleanup of User's Temporary Directories.
Dec 01 09:13:19 compute-0 systemd[76658]: Reached target Paths.
Dec 01 09:13:19 compute-0 systemd[76658]: Reached target Timers.
Dec 01 09:13:19 compute-0 systemd[76658]: Starting D-Bus User Message Bus Socket...
Dec 01 09:13:19 compute-0 systemd[76658]: Starting Create User's Volatile Files and Directories...
Dec 01 09:13:19 compute-0 systemd[76658]: Finished Create User's Volatile Files and Directories.
Dec 01 09:13:19 compute-0 systemd[76658]: Listening on D-Bus User Message Bus Socket.
Dec 01 09:13:19 compute-0 systemd[76658]: Reached target Sockets.
Dec 01 09:13:19 compute-0 systemd[76658]: Reached target Basic System.
Dec 01 09:13:19 compute-0 systemd[76658]: Reached target Main User Target.
Dec 01 09:13:19 compute-0 systemd[76658]: Startup finished in 149ms.
Dec 01 09:13:19 compute-0 systemd[1]: Started User Manager for UID 42477.
Dec 01 09:13:19 compute-0 systemd[1]: Started Session 21 of User ceph-admin.
Dec 01 09:13:19 compute-0 systemd[1]: Started Session 23 of User ceph-admin.
Dec 01 09:13:19 compute-0 sshd-session[76654]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 01 09:13:19 compute-0 sshd-session[76671]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 01 09:13:19 compute-0 sudo[76678]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:19 compute-0 sudo[76678]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:19 compute-0 sudo[76678]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:19 compute-0 sudo[76703]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:13:19 compute-0 sudo[76703]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:19 compute-0 sudo[76703]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:19 compute-0 sshd-session[76728]: Accepted publickey for ceph-admin from 192.168.122.100 port 47868 ssh2: RSA SHA256:dOPvQuuIN3kx2Wedm5OOoD6LbPjc9bLwbWo2YXzVB2E
Dec 01 09:13:19 compute-0 systemd-logind[788]: New session 24 of user ceph-admin.
Dec 01 09:13:19 compute-0 systemd[1]: Started Session 24 of User ceph-admin.
Dec 01 09:13:19 compute-0 sshd-session[76728]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 01 09:13:19 compute-0 sudo[76732]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:19 compute-0 sudo[76732]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:19 compute-0 sudo[76732]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:19 compute-0 sudo[76757]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 check-host --expect-hostname compute-0
Dec 01 09:13:19 compute-0 sudo[76757]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:19 compute-0 sudo[76757]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:19 compute-0 ceph-mon[75031]: from='client.14154 -' entity='client.admin' cmd=[{"prefix": "orch host add", "hostname": "compute-0", "addr": "192.168.122.100", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:13:19 compute-0 sshd-session[76782]: Accepted publickey for ceph-admin from 192.168.122.100 port 47872 ssh2: RSA SHA256:dOPvQuuIN3kx2Wedm5OOoD6LbPjc9bLwbWo2YXzVB2E
Dec 01 09:13:19 compute-0 systemd-logind[788]: New session 25 of user ceph-admin.
Dec 01 09:13:19 compute-0 systemd[1]: Started Session 25 of User ceph-admin.
Dec 01 09:13:19 compute-0 sshd-session[76782]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 01 09:13:19 compute-0 sudo[76786]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:19 compute-0 sudo[76786]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:19 compute-0 sudo[76786]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:20 compute-0 sudo[76811]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d
Dec 01 09:13:20 compute-0 sudo[76811]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:20 compute-0 sudo[76811]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:20 compute-0 ceph-mgr[75324]: [cephadm INFO cephadm.serve] Deploying cephadm binary to compute-0
Dec 01 09:13:20 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Deploying cephadm binary to compute-0
Dec 01 09:13:20 compute-0 sshd-session[76836]: Accepted publickey for ceph-admin from 192.168.122.100 port 47888 ssh2: RSA SHA256:dOPvQuuIN3kx2Wedm5OOoD6LbPjc9bLwbWo2YXzVB2E
Dec 01 09:13:20 compute-0 systemd-logind[788]: New session 26 of user ceph-admin.
Dec 01 09:13:20 compute-0 systemd[1]: Started Session 26 of User ceph-admin.
Dec 01 09:13:20 compute-0 sshd-session[76836]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 01 09:13:20 compute-0 sudo[76840]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:20 compute-0 sudo[76840]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:20 compute-0 sudo[76840]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:20 compute-0 sudo[76865]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b
Dec 01 09:13:20 compute-0 sudo[76865]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:20 compute-0 sudo[76865]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:20 compute-0 sshd-session[76890]: Accepted publickey for ceph-admin from 192.168.122.100 port 47896 ssh2: RSA SHA256:dOPvQuuIN3kx2Wedm5OOoD6LbPjc9bLwbWo2YXzVB2E
Dec 01 09:13:20 compute-0 systemd-logind[788]: New session 27 of user ceph-admin.
Dec 01 09:13:20 compute-0 systemd[1]: Started Session 27 of User ceph-admin.
Dec 01 09:13:20 compute-0 sshd-session[76890]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 01 09:13:20 compute-0 sudo[76894]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:20 compute-0 sudo[76894]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:20 compute-0 sudo[76894]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:20 compute-0 sudo[76919]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-5620a9fb-e540-5250-a0e8-7aaad5347e3b/var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b
Dec 01 09:13:20 compute-0 sudo[76919]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:20 compute-0 sudo[76919]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:20 compute-0 ceph-mgr[75324]: mgr.server send_report Not sending PG status to monitor yet, waiting for OSDs
Dec 01 09:13:20 compute-0 ceph-mon[75031]: Deploying cephadm binary to compute-0
Dec 01 09:13:21 compute-0 sshd-session[76944]: Accepted publickey for ceph-admin from 192.168.122.100 port 47908 ssh2: RSA SHA256:dOPvQuuIN3kx2Wedm5OOoD6LbPjc9bLwbWo2YXzVB2E
Dec 01 09:13:21 compute-0 systemd-logind[788]: New session 28 of user ceph-admin.
Dec 01 09:13:21 compute-0 systemd[1]: Started Session 28 of User ceph-admin.
Dec 01 09:13:21 compute-0 sshd-session[76944]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 01 09:13:21 compute-0 sudo[76948]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:21 compute-0 sudo[76948]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:21 compute-0 sudo[76948]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:21 compute-0 sudo[76973]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-5620a9fb-e540-5250-a0e8-7aaad5347e3b/var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d.new
Dec 01 09:13:21 compute-0 sudo[76973]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:21 compute-0 sudo[76973]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:21 compute-0 sshd-session[76998]: Accepted publickey for ceph-admin from 192.168.122.100 port 47912 ssh2: RSA SHA256:dOPvQuuIN3kx2Wedm5OOoD6LbPjc9bLwbWo2YXzVB2E
Dec 01 09:13:21 compute-0 systemd-logind[788]: New session 29 of user ceph-admin.
Dec 01 09:13:21 compute-0 systemd[1]: Started Session 29 of User ceph-admin.
Dec 01 09:13:21 compute-0 sshd-session[76998]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 01 09:13:21 compute-0 sudo[77002]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:21 compute-0 sudo[77002]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:21 compute-0 sudo[77002]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:21 compute-0 sudo[77027]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-5620a9fb-e540-5250-a0e8-7aaad5347e3b
Dec 01 09:13:21 compute-0 sudo[77027]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:21 compute-0 sudo[77027]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:21 compute-0 sshd-session[77052]: Accepted publickey for ceph-admin from 192.168.122.100 port 47928 ssh2: RSA SHA256:dOPvQuuIN3kx2Wedm5OOoD6LbPjc9bLwbWo2YXzVB2E
Dec 01 09:13:21 compute-0 systemd-logind[788]: New session 30 of user ceph-admin.
Dec 01 09:13:21 compute-0 systemd[1]: Started Session 30 of User ceph-admin.
Dec 01 09:13:21 compute-0 sshd-session[77052]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 01 09:13:22 compute-0 sudo[77056]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:22 compute-0 sudo[77056]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:22 compute-0 sudo[77056]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:22 compute-0 sudo[77081]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-5620a9fb-e540-5250-a0e8-7aaad5347e3b/var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d.new
Dec 01 09:13:22 compute-0 sudo[77081]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:22 compute-0 sudo[77081]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:22 compute-0 sshd-session[77106]: Accepted publickey for ceph-admin from 192.168.122.100 port 47932 ssh2: RSA SHA256:dOPvQuuIN3kx2Wedm5OOoD6LbPjc9bLwbWo2YXzVB2E
Dec 01 09:13:22 compute-0 systemd-logind[788]: New session 31 of user ceph-admin.
Dec 01 09:13:22 compute-0 systemd[1]: Started Session 31 of User ceph-admin.
Dec 01 09:13:22 compute-0 sshd-session[77106]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 01 09:13:22 compute-0 sshd-session[77133]: Accepted publickey for ceph-admin from 192.168.122.100 port 47934 ssh2: RSA SHA256:dOPvQuuIN3kx2Wedm5OOoD6LbPjc9bLwbWo2YXzVB2E
Dec 01 09:13:22 compute-0 systemd-logind[788]: New session 32 of user ceph-admin.
Dec 01 09:13:22 compute-0 systemd[1]: Started Session 32 of User ceph-admin.
Dec 01 09:13:22 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e2 _set_new_cache_sizes cache_size:1020053034 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:13:22 compute-0 sshd-session[77133]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 01 09:13:22 compute-0 ceph-mgr[75324]: mgr.server send_report Not sending PG status to monitor yet, waiting for OSDs
Dec 01 09:13:23 compute-0 sudo[77137]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:23 compute-0 sudo[77137]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:23 compute-0 sudo[77137]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:23 compute-0 sudo[77162]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-5620a9fb-e540-5250-a0e8-7aaad5347e3b/var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d.new /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d
Dec 01 09:13:23 compute-0 sudo[77162]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:23 compute-0 sudo[77162]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:23 compute-0 sshd-session[77187]: Accepted publickey for ceph-admin from 192.168.122.100 port 47946 ssh2: RSA SHA256:dOPvQuuIN3kx2Wedm5OOoD6LbPjc9bLwbWo2YXzVB2E
Dec 01 09:13:23 compute-0 systemd-logind[788]: New session 33 of user ceph-admin.
Dec 01 09:13:23 compute-0 systemd[1]: Started Session 33 of User ceph-admin.
Dec 01 09:13:23 compute-0 sshd-session[77187]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 01 09:13:23 compute-0 sudo[77191]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:23 compute-0 sudo[77191]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:23 compute-0 sudo[77191]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:23 compute-0 sudo[77216]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 check-host --expect-hostname compute-0
Dec 01 09:13:23 compute-0 sudo[77216]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:23 compute-0 sudo[77216]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:23 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0) v1
Dec 01 09:13:23 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:23 compute-0 ceph-mgr[75324]: [cephadm INFO root] Added host compute-0
Dec 01 09:13:23 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Added host compute-0
Dec 01 09:13:23 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config dump", "format": "json"} v 0) v1
Dec 01 09:13:23 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
Dec 01 09:13:23 compute-0 zen_mahavira[76628]: Added host 'compute-0' with addr '192.168.122.100'
Dec 01 09:13:23 compute-0 systemd[1]: libpod-91a57114cca0cf3362a3a88a1b751240f71868452258989b1d2e51e2e85af172.scope: Deactivated successfully.
Dec 01 09:13:23 compute-0 podman[76612]: 2025-12-01 09:13:23.914857286 +0000 UTC m=+5.870183030 container died 91a57114cca0cf3362a3a88a1b751240f71868452258989b1d2e51e2e85af172 (image=quay.io/ceph/ceph:v18, name=zen_mahavira, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:13:23 compute-0 sudo[77261]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:23 compute-0 sudo[77261]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:23 compute-0 sudo[77261]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:23 compute-0 sudo[77299]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:13:23 compute-0 sudo[77299]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:23 compute-0 sudo[77299]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:24 compute-0 systemd[1]: var-lib-containers-storage-overlay-746a7322c464db3f4e8fd9d40f5c3e04af5c9ab61050d1af559f36551205884d-merged.mount: Deactivated successfully.
Dec 01 09:13:24 compute-0 sudo[77324]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:24 compute-0 sudo[77324]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:24 compute-0 sudo[77324]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:24 compute-0 podman[76612]: 2025-12-01 09:13:24.063419045 +0000 UTC m=+6.018744789 container remove 91a57114cca0cf3362a3a88a1b751240f71868452258989b1d2e51e2e85af172 (image=quay.io/ceph/ceph:v18, name=zen_mahavira, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Dec 01 09:13:24 compute-0 systemd[1]: libpod-conmon-91a57114cca0cf3362a3a88a1b751240f71868452258989b1d2e51e2e85af172.scope: Deactivated successfully.
Dec 01 09:13:24 compute-0 sudo[77351]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph:v18 --timeout 895 inspect-image
Dec 01 09:13:24 compute-0 sudo[77351]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:24 compute-0 podman[77364]: 2025-12-01 09:13:24.130483105 +0000 UTC m=+0.042742539 container create f23d81df70052d5a22e44c2a60d10039226fd70ef11f7c74e528647948a2af94 (image=quay.io/ceph/ceph:v18, name=priceless_herschel, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:13:24 compute-0 systemd[1]: Started libpod-conmon-f23d81df70052d5a22e44c2a60d10039226fd70ef11f7c74e528647948a2af94.scope.
Dec 01 09:13:24 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:13:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9d5a3b5a8d5b9bb446cd4268cb5f458c2969d3fb49469135ba214f3d46b8043/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9d5a3b5a8d5b9bb446cd4268cb5f458c2969d3fb49469135ba214f3d46b8043/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9d5a3b5a8d5b9bb446cd4268cb5f458c2969d3fb49469135ba214f3d46b8043/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:24 compute-0 podman[77364]: 2025-12-01 09:13:24.199784112 +0000 UTC m=+0.112043576 container init f23d81df70052d5a22e44c2a60d10039226fd70ef11f7c74e528647948a2af94 (image=quay.io/ceph/ceph:v18, name=priceless_herschel, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:13:24 compute-0 podman[77364]: 2025-12-01 09:13:24.205601844 +0000 UTC m=+0.117861288 container start f23d81df70052d5a22e44c2a60d10039226fd70ef11f7c74e528647948a2af94 (image=quay.io/ceph/ceph:v18, name=priceless_herschel, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:13:24 compute-0 podman[77364]: 2025-12-01 09:13:24.114868912 +0000 UTC m=+0.027128376 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:13:24 compute-0 podman[77364]: 2025-12-01 09:13:24.214491028 +0000 UTC m=+0.126750502 container attach f23d81df70052d5a22e44c2a60d10039226fd70ef11f7c74e528647948a2af94 (image=quay.io/ceph/ceph:v18, name=priceless_herschel, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:13:24 compute-0 podman[77424]: 2025-12-01 09:13:24.532205937 +0000 UTC m=+0.074000907 container create 6b389e753c258af03b5545496acb441c12fd933934e4fb6461759d3c1e6a7f45 (image=quay.io/ceph/ceph:v18, name=bold_pasteur, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec 01 09:13:24 compute-0 systemd[1]: Started libpod-conmon-6b389e753c258af03b5545496acb441c12fd933934e4fb6461759d3c1e6a7f45.scope.
Dec 01 09:13:24 compute-0 podman[77424]: 2025-12-01 09:13:24.490607472 +0000 UTC m=+0.032402442 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:13:24 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:13:24 compute-0 podman[77424]: 2025-12-01 09:13:24.604232284 +0000 UTC m=+0.146027284 container init 6b389e753c258af03b5545496acb441c12fd933934e4fb6461759d3c1e6a7f45 (image=quay.io/ceph/ceph:v18, name=bold_pasteur, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Dec 01 09:13:24 compute-0 podman[77424]: 2025-12-01 09:13:24.61049828 +0000 UTC m=+0.152293240 container start 6b389e753c258af03b5545496acb441c12fd933934e4fb6461759d3c1e6a7f45 (image=quay.io/ceph/ceph:v18, name=bold_pasteur, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_REF=reef)
Dec 01 09:13:24 compute-0 podman[77424]: 2025-12-01 09:13:24.613589352 +0000 UTC m=+0.155384312 container attach 6b389e753c258af03b5545496acb441c12fd933934e4fb6461759d3c1e6a7f45 (image=quay.io/ceph/ceph:v18, name=bold_pasteur, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507)
Dec 01 09:13:24 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14156 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mon", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:13:24 compute-0 ceph-mgr[75324]: [cephadm INFO root] Saving service mon spec with placement count:5
Dec 01 09:13:24 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Saving service mon spec with placement count:5
Dec 01 09:13:24 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) v1
Dec 01 09:13:24 compute-0 bold_pasteur[77459]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable)
Dec 01 09:13:24 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:24 compute-0 priceless_herschel[77392]: Scheduled mon update...
Dec 01 09:13:24 compute-0 systemd[1]: libpod-6b389e753c258af03b5545496acb441c12fd933934e4fb6461759d3c1e6a7f45.scope: Deactivated successfully.
Dec 01 09:13:24 compute-0 podman[77424]: 2025-12-01 09:13:24.924388765 +0000 UTC m=+0.466183735 container died 6b389e753c258af03b5545496acb441c12fd933934e4fb6461759d3c1e6a7f45 (image=quay.io/ceph/ceph:v18, name=bold_pasteur, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:13:24 compute-0 systemd[1]: libpod-f23d81df70052d5a22e44c2a60d10039226fd70ef11f7c74e528647948a2af94.scope: Deactivated successfully.
Dec 01 09:13:24 compute-0 ceph-mgr[75324]: mgr.server send_report Not sending PG status to monitor yet, waiting for OSDs
Dec 01 09:13:25 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:25 compute-0 ceph-mon[75031]: Added host compute-0
Dec 01 09:13:25 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
Dec 01 09:13:25 compute-0 systemd[1]: var-lib-containers-storage-overlay-d16d0e4ee2e4e24f339ac8a406f324b8b3a879513f4537f785e8cbf09f7c7e9b-merged.mount: Deactivated successfully.
Dec 01 09:13:25 compute-0 podman[77424]: 2025-12-01 09:13:25.460630228 +0000 UTC m=+1.002425198 container remove 6b389e753c258af03b5545496acb441c12fd933934e4fb6461759d3c1e6a7f45 (image=quay.io/ceph/ceph:v18, name=bold_pasteur, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2)
Dec 01 09:13:25 compute-0 systemd[1]: libpod-conmon-6b389e753c258af03b5545496acb441c12fd933934e4fb6461759d3c1e6a7f45.scope: Deactivated successfully.
Dec 01 09:13:25 compute-0 podman[77364]: 2025-12-01 09:13:25.475152709 +0000 UTC m=+1.387412153 container died f23d81df70052d5a22e44c2a60d10039226fd70ef11f7c74e528647948a2af94 (image=quay.io/ceph/ceph:v18, name=priceless_herschel, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef)
Dec 01 09:13:25 compute-0 systemd[1]: var-lib-containers-storage-overlay-e9d5a3b5a8d5b9bb446cd4268cb5f458c2969d3fb49469135ba214f3d46b8043-merged.mount: Deactivated successfully.
Dec 01 09:13:25 compute-0 sudo[77351]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config set, name=container_image}] v 0) v1
Dec 01 09:13:25 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:25 compute-0 podman[77472]: 2025-12-01 09:13:25.514679902 +0000 UTC m=+0.563275436 container remove f23d81df70052d5a22e44c2a60d10039226fd70ef11f7c74e528647948a2af94 (image=quay.io/ceph/ceph:v18, name=priceless_herschel, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Dec 01 09:13:25 compute-0 systemd[1]: libpod-conmon-f23d81df70052d5a22e44c2a60d10039226fd70ef11f7c74e528647948a2af94.scope: Deactivated successfully.
Dec 01 09:13:25 compute-0 sudo[77492]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:25 compute-0 sudo[77492]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:25 compute-0 podman[77493]: 2025-12-01 09:13:25.574861858 +0000 UTC m=+0.036354490 container create 33c746197e121554bba70bb7341da610a91f1c51096d1dfa822e02d9b56aadc8 (image=quay.io/ceph/ceph:v18, name=youthful_curran, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, OSD_FLAVOR=default)
Dec 01 09:13:25 compute-0 sudo[77492]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:25 compute-0 systemd[1]: Started libpod-conmon-33c746197e121554bba70bb7341da610a91f1c51096d1dfa822e02d9b56aadc8.scope.
Dec 01 09:13:25 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:13:25 compute-0 sudo[77531]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:13:25 compute-0 sudo[77531]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/baf28985a89a43298a0e5fe91f7a6d3f044b1a33233dcfb039654d6560a25169/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/baf28985a89a43298a0e5fe91f7a6d3f044b1a33233dcfb039654d6560a25169/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/baf28985a89a43298a0e5fe91f7a6d3f044b1a33233dcfb039654d6560a25169/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:25 compute-0 sudo[77531]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:25 compute-0 podman[77493]: 2025-12-01 09:13:25.651968937 +0000 UTC m=+0.113461579 container init 33c746197e121554bba70bb7341da610a91f1c51096d1dfa822e02d9b56aadc8 (image=quay.io/ceph/ceph:v18, name=youthful_curran, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Dec 01 09:13:25 compute-0 podman[77493]: 2025-12-01 09:13:25.558927695 +0000 UTC m=+0.020420347 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:13:25 compute-0 podman[77493]: 2025-12-01 09:13:25.658844711 +0000 UTC m=+0.120337363 container start 33c746197e121554bba70bb7341da610a91f1c51096d1dfa822e02d9b56aadc8 (image=quay.io/ceph/ceph:v18, name=youthful_curran, CEPH_REF=reef, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec 01 09:13:25 compute-0 podman[77493]: 2025-12-01 09:13:25.66218896 +0000 UTC m=+0.123681602 container attach 33c746197e121554bba70bb7341da610a91f1c51096d1dfa822e02d9b56aadc8 (image=quay.io/ceph/ceph:v18, name=youthful_curran, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Dec 01 09:13:25 compute-0 sudo[77561]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:25 compute-0 sudo[77561]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:25 compute-0 sudo[77561]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:25 compute-0 sudo[77589]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 check-host
Dec 01 09:13:25 compute-0 sudo[77589]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:25 compute-0 sudo[77589]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:26 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:13:26 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:26 compute-0 sudo[77645]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:26 compute-0 sudo[77645]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:26 compute-0 sudo[77645]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:26 compute-0 sudo[77679]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:13:26 compute-0 sudo[77679]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:26 compute-0 sudo[77679]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:26 compute-0 ceph-mon[75031]: from='client.14156 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mon", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:13:26 compute-0 ceph-mon[75031]: Saving service mon spec with placement count:5
Dec 01 09:13:26 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:26 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:26 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:26 compute-0 sudo[77704]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:26 compute-0 sudo[77704]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:26 compute-0 sudo[77704]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:26 compute-0 sudo[77729]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ls
Dec 01 09:13:26 compute-0 sudo[77729]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:26 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14158 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:13:26 compute-0 ceph-mgr[75324]: [cephadm INFO root] Saving service mgr spec with placement count:2
Dec 01 09:13:26 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Saving service mgr spec with placement count:2
Dec 01 09:13:26 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mgr}] v 0) v1
Dec 01 09:13:26 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:26 compute-0 youthful_curran[77553]: Scheduled mgr update...
Dec 01 09:13:26 compute-0 systemd[1]: libpod-33c746197e121554bba70bb7341da610a91f1c51096d1dfa822e02d9b56aadc8.scope: Deactivated successfully.
Dec 01 09:13:26 compute-0 podman[77493]: 2025-12-01 09:13:26.29602742 +0000 UTC m=+0.757520062 container died 33c746197e121554bba70bb7341da610a91f1c51096d1dfa822e02d9b56aadc8 (image=quay.io/ceph/ceph:v18, name=youthful_curran, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Dec 01 09:13:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-baf28985a89a43298a0e5fe91f7a6d3f044b1a33233dcfb039654d6560a25169-merged.mount: Deactivated successfully.
Dec 01 09:13:26 compute-0 podman[77493]: 2025-12-01 09:13:26.336463669 +0000 UTC m=+0.797956301 container remove 33c746197e121554bba70bb7341da610a91f1c51096d1dfa822e02d9b56aadc8 (image=quay.io/ceph/ceph:v18, name=youthful_curran, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:13:26 compute-0 systemd[1]: libpod-conmon-33c746197e121554bba70bb7341da610a91f1c51096d1dfa822e02d9b56aadc8.scope: Deactivated successfully.
Dec 01 09:13:26 compute-0 podman[77771]: 2025-12-01 09:13:26.400457589 +0000 UTC m=+0.043945836 container create c5aaa30a058f9a15041bdc9d9279550c90d9823b7bdeeba23c9381bbea32ccb5 (image=quay.io/ceph/ceph:v18, name=sweet_chebyshev, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_REF=reef, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507)
Dec 01 09:13:26 compute-0 systemd[1]: Started libpod-conmon-c5aaa30a058f9a15041bdc9d9279550c90d9823b7bdeeba23c9381bbea32ccb5.scope.
Dec 01 09:13:26 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:13:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7f0a310e2443170f6fde92a8407bbdd079f9ac5a5daf5ab09e71ec63a5e04f2c/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7f0a310e2443170f6fde92a8407bbdd079f9ac5a5daf5ab09e71ec63a5e04f2c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7f0a310e2443170f6fde92a8407bbdd079f9ac5a5daf5ab09e71ec63a5e04f2c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:26 compute-0 podman[77771]: 2025-12-01 09:13:26.474839456 +0000 UTC m=+0.118327743 container init c5aaa30a058f9a15041bdc9d9279550c90d9823b7bdeeba23c9381bbea32ccb5 (image=quay.io/ceph/ceph:v18, name=sweet_chebyshev, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:13:26 compute-0 podman[77771]: 2025-12-01 09:13:26.382872287 +0000 UTC m=+0.026360554 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:13:26 compute-0 podman[77771]: 2025-12-01 09:13:26.482352119 +0000 UTC m=+0.125840366 container start c5aaa30a058f9a15041bdc9d9279550c90d9823b7bdeeba23c9381bbea32ccb5 (image=quay.io/ceph/ceph:v18, name=sweet_chebyshev, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, ceph=True, org.label-schema.license=GPLv2)
Dec 01 09:13:26 compute-0 podman[77771]: 2025-12-01 09:13:26.48575276 +0000 UTC m=+0.129241047 container attach c5aaa30a058f9a15041bdc9d9279550c90d9823b7bdeeba23c9381bbea32ccb5 (image=quay.io/ceph/ceph:v18, name=sweet_chebyshev, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:13:26 compute-0 podman[77864]: 2025-12-01 09:13:26.853306187 +0000 UTC m=+0.161400780 container exec a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:13:26 compute-0 ceph-mgr[75324]: mgr.server send_report Not sending PG status to monitor yet, waiting for OSDs
Dec 01 09:13:27 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14160 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "crash", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:13:27 compute-0 ceph-mgr[75324]: [cephadm INFO root] Saving service crash spec with placement *
Dec 01 09:13:27 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Saving service crash spec with placement *
Dec 01 09:13:27 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.crash}] v 0) v1
Dec 01 09:13:27 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:27 compute-0 sweet_chebyshev[77812]: Scheduled crash update...
Dec 01 09:13:27 compute-0 systemd[1]: libpod-c5aaa30a058f9a15041bdc9d9279550c90d9823b7bdeeba23c9381bbea32ccb5.scope: Deactivated successfully.
Dec 01 09:13:27 compute-0 podman[77771]: 2025-12-01 09:13:27.122110234 +0000 UTC m=+0.765598501 container died c5aaa30a058f9a15041bdc9d9279550c90d9823b7bdeeba23c9381bbea32ccb5 (image=quay.io/ceph/ceph:v18, name=sweet_chebyshev, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:13:27 compute-0 systemd[1]: var-lib-containers-storage-overlay-7f0a310e2443170f6fde92a8407bbdd079f9ac5a5daf5ab09e71ec63a5e04f2c-merged.mount: Deactivated successfully.
Dec 01 09:13:27 compute-0 podman[77771]: 2025-12-01 09:13:27.169055047 +0000 UTC m=+0.812543294 container remove c5aaa30a058f9a15041bdc9d9279550c90d9823b7bdeeba23c9381bbea32ccb5 (image=quay.io/ceph/ceph:v18, name=sweet_chebyshev, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec 01 09:13:27 compute-0 systemd[1]: libpod-conmon-c5aaa30a058f9a15041bdc9d9279550c90d9823b7bdeeba23c9381bbea32ccb5.scope: Deactivated successfully.
Dec 01 09:13:27 compute-0 podman[77864]: 2025-12-01 09:13:27.200687445 +0000 UTC m=+0.508782028 container exec_died a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec 01 09:13:27 compute-0 podman[77919]: 2025-12-01 09:13:27.246976079 +0000 UTC m=+0.057209809 container create 75413784da44089bed69d0d448e460a933514bfea123a7153a99dd5c2fdc3cbd (image=quay.io/ceph/ceph:v18, name=elastic_feynman, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 01 09:13:27 compute-0 ceph-mon[75031]: from='client.14158 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:13:27 compute-0 ceph-mon[75031]: Saving service mgr spec with placement count:2
Dec 01 09:13:27 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:27 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:27 compute-0 systemd[1]: Started libpod-conmon-75413784da44089bed69d0d448e460a933514bfea123a7153a99dd5c2fdc3cbd.scope.
Dec 01 09:13:27 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:13:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a1fa88f76e7c2e8d9e95454c4b4b0a75770a75a14c6c6ef50c8ce3abc6e4a0d5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a1fa88f76e7c2e8d9e95454c4b4b0a75770a75a14c6c6ef50c8ce3abc6e4a0d5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a1fa88f76e7c2e8d9e95454c4b4b0a75770a75a14c6c6ef50c8ce3abc6e4a0d5/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:27 compute-0 podman[77919]: 2025-12-01 09:13:27.223378258 +0000 UTC m=+0.033612008 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:13:27 compute-0 podman[77919]: 2025-12-01 09:13:27.319717137 +0000 UTC m=+0.129950887 container init 75413784da44089bed69d0d448e460a933514bfea123a7153a99dd5c2fdc3cbd (image=quay.io/ceph/ceph:v18, name=elastic_feynman, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Dec 01 09:13:27 compute-0 podman[77919]: 2025-12-01 09:13:27.327670733 +0000 UTC m=+0.137904463 container start 75413784da44089bed69d0d448e460a933514bfea123a7153a99dd5c2fdc3cbd (image=quay.io/ceph/ceph:v18, name=elastic_feynman, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:13:27 compute-0 podman[77919]: 2025-12-01 09:13:27.331079774 +0000 UTC m=+0.141313504 container attach 75413784da44089bed69d0d448e460a933514bfea123a7153a99dd5c2fdc3cbd (image=quay.io/ceph/ceph:v18, name=elastic_feynman, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Dec 01 09:13:27 compute-0 sudo[77729]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:27 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:13:27 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:27 compute-0 sudo[77968]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:27 compute-0 sudo[77968]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:27 compute-0 sudo[77968]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:27 compute-0 sudo[77993]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:13:27 compute-0 sudo[77993]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:27 compute-0 sudo[77993]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:27 compute-0 sudo[78018]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:27 compute-0 sudo[78018]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:27 compute-0 sudo[78018]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:27 compute-0 sudo[78043]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Dec 01 09:13:27 compute-0 sudo[78043]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:27 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 78098 (sysctl)
Dec 01 09:13:27 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Dec 01 09:13:27 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Dec 01 09:13:27 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config set, name=mgr/cephadm/container_init}] v 0) v1
Dec 01 09:13:27 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/2604732978' entity='client.admin' 
Dec 01 09:13:27 compute-0 systemd[1]: libpod-75413784da44089bed69d0d448e460a933514bfea123a7153a99dd5c2fdc3cbd.scope: Deactivated successfully.
Dec 01 09:13:27 compute-0 podman[77919]: 2025-12-01 09:13:27.888816196 +0000 UTC m=+0.699049926 container died 75413784da44089bed69d0d448e460a933514bfea123a7153a99dd5c2fdc3cbd (image=quay.io/ceph/ceph:v18, name=elastic_feynman, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Dec 01 09:13:27 compute-0 systemd[1]: var-lib-containers-storage-overlay-a1fa88f76e7c2e8d9e95454c4b4b0a75770a75a14c6c6ef50c8ce3abc6e4a0d5-merged.mount: Deactivated successfully.
Dec 01 09:13:27 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e2 _set_new_cache_sizes cache_size:1020054710 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:13:27 compute-0 podman[77919]: 2025-12-01 09:13:27.934200773 +0000 UTC m=+0.744434503 container remove 75413784da44089bed69d0d448e460a933514bfea123a7153a99dd5c2fdc3cbd (image=quay.io/ceph/ceph:v18, name=elastic_feynman, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507)
Dec 01 09:13:27 compute-0 systemd[1]: libpod-conmon-75413784da44089bed69d0d448e460a933514bfea123a7153a99dd5c2fdc3cbd.scope: Deactivated successfully.
Dec 01 09:13:27 compute-0 podman[78123]: 2025-12-01 09:13:27.996407019 +0000 UTC m=+0.042648367 container create 4c185033738ae51093916a8c5656129598cf9e47fd41d48b10b888641224fc9e (image=quay.io/ceph/ceph:v18, name=serene_diffie, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default)
Dec 01 09:13:28 compute-0 systemd[1]: Started libpod-conmon-4c185033738ae51093916a8c5656129598cf9e47fd41d48b10b888641224fc9e.scope.
Dec 01 09:13:28 compute-0 sudo[78043]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:28 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:13:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dbb4441c3f005973e0ad79d1c4bb1e5aa518d2e4ee346631f4cbfde85eab690d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dbb4441c3f005973e0ad79d1c4bb1e5aa518d2e4ee346631f4cbfde85eab690d/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dbb4441c3f005973e0ad79d1c4bb1e5aa518d2e4ee346631f4cbfde85eab690d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:28 compute-0 podman[78123]: 2025-12-01 09:13:27.977770266 +0000 UTC m=+0.024011634 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:13:28 compute-0 podman[78123]: 2025-12-01 09:13:28.077694621 +0000 UTC m=+0.123935989 container init 4c185033738ae51093916a8c5656129598cf9e47fd41d48b10b888641224fc9e (image=quay.io/ceph/ceph:v18, name=serene_diffie, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507)
Dec 01 09:13:28 compute-0 podman[78123]: 2025-12-01 09:13:28.083542184 +0000 UTC m=+0.129783532 container start 4c185033738ae51093916a8c5656129598cf9e47fd41d48b10b888641224fc9e (image=quay.io/ceph/ceph:v18, name=serene_diffie, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default)
Dec 01 09:13:28 compute-0 podman[78123]: 2025-12-01 09:13:28.086629946 +0000 UTC m=+0.132871324 container attach 4c185033738ae51093916a8c5656129598cf9e47fd41d48b10b888641224fc9e (image=quay.io/ceph/ceph:v18, name=serene_diffie, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.39.3, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:13:28 compute-0 sudo[78153]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:28 compute-0 sudo[78153]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:28 compute-0 sudo[78153]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:28 compute-0 sudo[78180]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:13:28 compute-0 sudo[78180]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:28 compute-0 sudo[78180]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:28 compute-0 sudo[78205]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:28 compute-0 sudo[78205]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:28 compute-0 sudo[78205]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:28 compute-0 sudo[78230]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 list-networks
Dec 01 09:13:28 compute-0 sudo[78230]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:28 compute-0 sudo[78230]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:28 compute-0 ceph-mon[75031]: from='client.14160 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "crash", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:13:28 compute-0 ceph-mon[75031]: Saving service crash spec with placement *
Dec 01 09:13:28 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:28 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2604732978' entity='client.admin' 
Dec 01 09:13:28 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:13:28 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:28 compute-0 sudo[78293]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:28 compute-0 sudo[78293]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:28 compute-0 sudo[78293]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:28 compute-0 sudo[78318]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:13:28 compute-0 sudo[78318]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:28 compute-0 sudo[78318]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:28 compute-0 sudo[78343]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:28 compute-0 sudo[78343]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:28 compute-0 sudo[78343]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:28 compute-0 ceph-mgr[75324]: mgr.server send_report Not sending PG status to monitor yet, waiting for OSDs
Dec 01 09:13:28 compute-0 sudo[78368]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- inventory --format=json-pretty --filter-for-batch
Dec 01 09:13:28 compute-0 sudo[78368]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:29 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14164 -' entity='client.admin' cmd=[{"prefix": "orch client-keyring set", "entity": "client.admin", "placement": "label:_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:13:29 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/client_keyrings}] v 0) v1
Dec 01 09:13:29 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:29 compute-0 systemd[1]: libpod-4c185033738ae51093916a8c5656129598cf9e47fd41d48b10b888641224fc9e.scope: Deactivated successfully.
Dec 01 09:13:29 compute-0 podman[78123]: 2025-12-01 09:13:29.030739703 +0000 UTC m=+1.076981081 container died 4c185033738ae51093916a8c5656129598cf9e47fd41d48b10b888641224fc9e (image=quay.io/ceph/ceph:v18, name=serene_diffie, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507)
Dec 01 09:13:29 compute-0 systemd[1]: var-lib-containers-storage-overlay-dbb4441c3f005973e0ad79d1c4bb1e5aa518d2e4ee346631f4cbfde85eab690d-merged.mount: Deactivated successfully.
Dec 01 09:13:29 compute-0 systemd[1]: libpod-conmon-4c185033738ae51093916a8c5656129598cf9e47fd41d48b10b888641224fc9e.scope: Deactivated successfully.
Dec 01 09:13:29 compute-0 podman[78123]: 2025-12-01 09:13:29.065886446 +0000 UTC m=+1.112127794 container remove 4c185033738ae51093916a8c5656129598cf9e47fd41d48b10b888641224fc9e (image=quay.io/ceph/ceph:v18, name=serene_diffie, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:13:29 compute-0 podman[78407]: 2025-12-01 09:13:29.136130171 +0000 UTC m=+0.042494312 container create dd65099a3dd6acd4df9ce26301ff3e40cd59f140ccd749bdee079f35b5866b0d (image=quay.io/ceph/ceph:v18, name=recursing_zhukovsky, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:13:29 compute-0 systemd[1]: Started libpod-conmon-dd65099a3dd6acd4df9ce26301ff3e40cd59f140ccd749bdee079f35b5866b0d.scope.
Dec 01 09:13:29 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:13:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f18712edbbe4e21ac9262f112543cee5b176447de534cf232d886e12829dd089/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f18712edbbe4e21ac9262f112543cee5b176447de534cf232d886e12829dd089/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f18712edbbe4e21ac9262f112543cee5b176447de534cf232d886e12829dd089/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:29 compute-0 podman[78407]: 2025-12-01 09:13:29.211562199 +0000 UTC m=+0.117926360 container init dd65099a3dd6acd4df9ce26301ff3e40cd59f140ccd749bdee079f35b5866b0d (image=quay.io/ceph/ceph:v18, name=recursing_zhukovsky, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2)
Dec 01 09:13:29 compute-0 podman[78407]: 2025-12-01 09:13:29.117331383 +0000 UTC m=+0.023695544 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:13:29 compute-0 podman[78407]: 2025-12-01 09:13:29.217376982 +0000 UTC m=+0.123741123 container start dd65099a3dd6acd4df9ce26301ff3e40cd59f140ccd749bdee079f35b5866b0d (image=quay.io/ceph/ceph:v18, name=recursing_zhukovsky, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:13:29 compute-0 podman[78407]: 2025-12-01 09:13:29.22034118 +0000 UTC m=+0.126705321 container attach dd65099a3dd6acd4df9ce26301ff3e40cd59f140ccd749bdee079f35b5866b0d (image=quay.io/ceph/ceph:v18, name=recursing_zhukovsky, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Dec 01 09:13:29 compute-0 podman[78467]: 2025-12-01 09:13:29.379431681 +0000 UTC m=+0.083098877 container create ccf06693f5842fff4de308a5f6933e8d57b2bbc6cded0adc06dbb32e6024f202 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_rubin, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default)
Dec 01 09:13:29 compute-0 podman[78467]: 2025-12-01 09:13:29.316345379 +0000 UTC m=+0.020012595 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:13:29 compute-0 systemd[1]: Started libpod-conmon-ccf06693f5842fff4de308a5f6933e8d57b2bbc6cded0adc06dbb32e6024f202.scope.
Dec 01 09:13:29 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:13:29 compute-0 podman[78467]: 2025-12-01 09:13:29.481874411 +0000 UTC m=+0.185541647 container init ccf06693f5842fff4de308a5f6933e8d57b2bbc6cded0adc06dbb32e6024f202 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_rubin, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec 01 09:13:29 compute-0 podman[78467]: 2025-12-01 09:13:29.488362963 +0000 UTC m=+0.192030159 container start ccf06693f5842fff4de308a5f6933e8d57b2bbc6cded0adc06dbb32e6024f202 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_rubin, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec 01 09:13:29 compute-0 podman[78467]: 2025-12-01 09:13:29.491673122 +0000 UTC m=+0.195340438 container attach ccf06693f5842fff4de308a5f6933e8d57b2bbc6cded0adc06dbb32e6024f202 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_rubin, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:13:29 compute-0 busy_rubin[78483]: 167 167
Dec 01 09:13:29 compute-0 systemd[1]: libpod-ccf06693f5842fff4de308a5f6933e8d57b2bbc6cded0adc06dbb32e6024f202.scope: Deactivated successfully.
Dec 01 09:13:29 compute-0 podman[78467]: 2025-12-01 09:13:29.49633973 +0000 UTC m=+0.200006916 container died ccf06693f5842fff4de308a5f6933e8d57b2bbc6cded0adc06dbb32e6024f202 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_rubin, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Dec 01 09:13:29 compute-0 systemd[1]: var-lib-containers-storage-overlay-b2a129378f30c0619c656d87793c2ebf33008bb8fbeec51482add953933b6d3d-merged.mount: Deactivated successfully.
Dec 01 09:13:29 compute-0 podman[78467]: 2025-12-01 09:13:29.535782101 +0000 UTC m=+0.239449297 container remove ccf06693f5842fff4de308a5f6933e8d57b2bbc6cded0adc06dbb32e6024f202 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_rubin, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507)
Dec 01 09:13:29 compute-0 systemd[1]: libpod-conmon-ccf06693f5842fff4de308a5f6933e8d57b2bbc6cded0adc06dbb32e6024f202.scope: Deactivated successfully.
Dec 01 09:13:29 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:29 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:29 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14166 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "compute-0", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:13:29 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0) v1
Dec 01 09:13:29 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:29 compute-0 ceph-mgr[75324]: [cephadm INFO root] Added label _admin to host compute-0
Dec 01 09:13:29 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Added label _admin to host compute-0
Dec 01 09:13:29 compute-0 recursing_zhukovsky[78441]: Added label _admin to host compute-0
Dec 01 09:13:29 compute-0 systemd[1]: libpod-dd65099a3dd6acd4df9ce26301ff3e40cd59f140ccd749bdee079f35b5866b0d.scope: Deactivated successfully.
Dec 01 09:13:29 compute-0 podman[78407]: 2025-12-01 09:13:29.843160612 +0000 UTC m=+0.749524753 container died dd65099a3dd6acd4df9ce26301ff3e40cd59f140ccd749bdee079f35b5866b0d (image=quay.io/ceph/ceph:v18, name=recursing_zhukovsky, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:13:29 compute-0 systemd[1]: var-lib-containers-storage-overlay-f18712edbbe4e21ac9262f112543cee5b176447de534cf232d886e12829dd089-merged.mount: Deactivated successfully.
Dec 01 09:13:29 compute-0 podman[78407]: 2025-12-01 09:13:29.938485591 +0000 UTC m=+0.844849732 container remove dd65099a3dd6acd4df9ce26301ff3e40cd59f140ccd749bdee079f35b5866b0d (image=quay.io/ceph/ceph:v18, name=recursing_zhukovsky, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 01 09:13:29 compute-0 systemd[1]: libpod-conmon-dd65099a3dd6acd4df9ce26301ff3e40cd59f140ccd749bdee079f35b5866b0d.scope: Deactivated successfully.
Dec 01 09:13:30 compute-0 podman[78534]: 2025-12-01 09:13:30.069409697 +0000 UTC m=+0.111780059 container create 08d5cf2604a52bb6ac585c3ca00cc938e73052767218375c5dd617a42b4e4c64 (image=quay.io/ceph/ceph:v18, name=xenodochial_brattain, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec 01 09:13:30 compute-0 podman[78534]: 2025-12-01 09:13:29.979684544 +0000 UTC m=+0.022054926 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:13:30 compute-0 systemd[1]: Started libpod-conmon-08d5cf2604a52bb6ac585c3ca00cc938e73052767218375c5dd617a42b4e4c64.scope.
Dec 01 09:13:30 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:13:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3770ec82946ea8cac3e6025ee0a2994907d5bd66be6f52c2009a25743afee56/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3770ec82946ea8cac3e6025ee0a2994907d5bd66be6f52c2009a25743afee56/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3770ec82946ea8cac3e6025ee0a2994907d5bd66be6f52c2009a25743afee56/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:30 compute-0 podman[78534]: 2025-12-01 09:13:30.138201508 +0000 UTC m=+0.180571890 container init 08d5cf2604a52bb6ac585c3ca00cc938e73052767218375c5dd617a42b4e4c64 (image=quay.io/ceph/ceph:v18, name=xenodochial_brattain, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:13:30 compute-0 podman[78534]: 2025-12-01 09:13:30.144382241 +0000 UTC m=+0.186752603 container start 08d5cf2604a52bb6ac585c3ca00cc938e73052767218375c5dd617a42b4e4c64 (image=quay.io/ceph/ceph:v18, name=xenodochial_brattain, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Dec 01 09:13:30 compute-0 podman[78534]: 2025-12-01 09:13:30.148066521 +0000 UTC m=+0.190436913 container attach 08d5cf2604a52bb6ac585c3ca00cc938e73052767218375c5dd617a42b4e4c64 (image=quay.io/ceph/ceph:v18, name=xenodochial_brattain, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2)
Dec 01 09:13:30 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config set, name=osd_memory_target_autotune}] v 0) v1
Dec 01 09:13:30 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/729133558' entity='client.admin' 
Dec 01 09:13:30 compute-0 systemd[1]: libpod-08d5cf2604a52bb6ac585c3ca00cc938e73052767218375c5dd617a42b4e4c64.scope: Deactivated successfully.
Dec 01 09:13:30 compute-0 podman[78534]: 2025-12-01 09:13:30.750662333 +0000 UTC m=+0.793032695 container died 08d5cf2604a52bb6ac585c3ca00cc938e73052767218375c5dd617a42b4e4c64 (image=quay.io/ceph/ceph:v18, name=xenodochial_brattain, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:13:30 compute-0 systemd[1]: var-lib-containers-storage-overlay-c3770ec82946ea8cac3e6025ee0a2994907d5bd66be6f52c2009a25743afee56-merged.mount: Deactivated successfully.
Dec 01 09:13:30 compute-0 podman[78534]: 2025-12-01 09:13:30.790507555 +0000 UTC m=+0.832877917 container remove 08d5cf2604a52bb6ac585c3ca00cc938e73052767218375c5dd617a42b4e4c64 (image=quay.io/ceph/ceph:v18, name=xenodochial_brattain, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0)
Dec 01 09:13:30 compute-0 systemd[1]: libpod-conmon-08d5cf2604a52bb6ac585c3ca00cc938e73052767218375c5dd617a42b4e4c64.scope: Deactivated successfully.
Dec 01 09:13:30 compute-0 ceph-mon[75031]: from='client.14164 -' entity='client.admin' cmd=[{"prefix": "orch client-keyring set", "entity": "client.admin", "placement": "label:_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:13:30 compute-0 ceph-mon[75031]: from='client.14166 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "compute-0", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:13:30 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:30 compute-0 ceph-mon[75031]: Added label _admin to host compute-0
Dec 01 09:13:30 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/729133558' entity='client.admin' 
Dec 01 09:13:30 compute-0 podman[78590]: 2025-12-01 09:13:30.866482649 +0000 UTC m=+0.057564669 container create a1edc7d2e0763bc6b695a6b72ecdb2932140776cd0fbd25db63ef6d8cd323a9c (image=quay.io/ceph/ceph:v18, name=stupefied_feistel, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Dec 01 09:13:30 compute-0 systemd[1]: Started libpod-conmon-a1edc7d2e0763bc6b695a6b72ecdb2932140776cd0fbd25db63ef6d8cd323a9c.scope.
Dec 01 09:13:30 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:13:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b761eeb2c6a6db13713e4031d484c943271247c27b8d78e5eea9a9389140da44/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b761eeb2c6a6db13713e4031d484c943271247c27b8d78e5eea9a9389140da44/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b761eeb2c6a6db13713e4031d484c943271247c27b8d78e5eea9a9389140da44/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:30 compute-0 podman[78590]: 2025-12-01 09:13:30.84124493 +0000 UTC m=+0.032327020 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:13:30 compute-0 podman[78590]: 2025-12-01 09:13:30.939933729 +0000 UTC m=+0.131015719 container init a1edc7d2e0763bc6b695a6b72ecdb2932140776cd0fbd25db63ef6d8cd323a9c (image=quay.io/ceph/ceph:v18, name=stupefied_feistel, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, ceph=True, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:13:30 compute-0 podman[78590]: 2025-12-01 09:13:30.945056121 +0000 UTC m=+0.136138111 container start a1edc7d2e0763bc6b695a6b72ecdb2932140776cd0fbd25db63ef6d8cd323a9c (image=quay.io/ceph/ceph:v18, name=stupefied_feistel, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 01 09:13:30 compute-0 podman[78590]: 2025-12-01 09:13:30.948248616 +0000 UTC m=+0.139330626 container attach a1edc7d2e0763bc6b695a6b72ecdb2932140776cd0fbd25db63ef6d8cd323a9c (image=quay.io/ceph/ceph:v18, name=stupefied_feistel, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:13:30 compute-0 ceph-mgr[75324]: mgr.server send_report Not sending PG status to monitor yet, waiting for OSDs
Dec 01 09:13:31 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/dashboard/cluster/status}] v 0) v1
Dec 01 09:13:31 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3188111268' entity='client.admin' 
Dec 01 09:13:31 compute-0 stupefied_feistel[78607]: set mgr/dashboard/cluster/status
Dec 01 09:13:31 compute-0 systemd[1]: libpod-a1edc7d2e0763bc6b695a6b72ecdb2932140776cd0fbd25db63ef6d8cd323a9c.scope: Deactivated successfully.
Dec 01 09:13:31 compute-0 podman[78590]: 2025-12-01 09:13:31.664709276 +0000 UTC m=+0.855791276 container died a1edc7d2e0763bc6b695a6b72ecdb2932140776cd0fbd25db63ef6d8cd323a9c (image=quay.io/ceph/ceph:v18, name=stupefied_feistel, ceph=True, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:13:31 compute-0 systemd[1]: var-lib-containers-storage-overlay-b761eeb2c6a6db13713e4031d484c943271247c27b8d78e5eea9a9389140da44-merged.mount: Deactivated successfully.
Dec 01 09:13:31 compute-0 podman[78590]: 2025-12-01 09:13:31.706223418 +0000 UTC m=+0.897305408 container remove a1edc7d2e0763bc6b695a6b72ecdb2932140776cd0fbd25db63ef6d8cd323a9c (image=quay.io/ceph/ceph:v18, name=stupefied_feistel, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Dec 01 09:13:31 compute-0 systemd[1]: libpod-conmon-a1edc7d2e0763bc6b695a6b72ecdb2932140776cd0fbd25db63ef6d8cd323a9c.scope: Deactivated successfully.
Dec 01 09:13:31 compute-0 sudo[74019]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:31 compute-0 podman[78654]: 2025-12-01 09:13:31.91552979 +0000 UTC m=+0.044991036 container create 1903bfb07306007cf7113c51609d1ccf343f3142a879364f18445ab07bcaa199 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=beautiful_jang, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:13:31 compute-0 systemd[1]: Started libpod-conmon-1903bfb07306007cf7113c51609d1ccf343f3142a879364f18445ab07bcaa199.scope.
Dec 01 09:13:31 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:13:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/10d2f33fd1b0c2fe7fb20b41c94ea34e3bd0c096ee73f6aeadddfa08650a78c1/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/10d2f33fd1b0c2fe7fb20b41c94ea34e3bd0c096ee73f6aeadddfa08650a78c1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/10d2f33fd1b0c2fe7fb20b41c94ea34e3bd0c096ee73f6aeadddfa08650a78c1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/10d2f33fd1b0c2fe7fb20b41c94ea34e3bd0c096ee73f6aeadddfa08650a78c1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:31 compute-0 podman[78654]: 2025-12-01 09:13:31.981012554 +0000 UTC m=+0.110473780 container init 1903bfb07306007cf7113c51609d1ccf343f3142a879364f18445ab07bcaa199 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=beautiful_jang, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:13:31 compute-0 podman[78654]: 2025-12-01 09:13:31.99066353 +0000 UTC m=+0.120124736 container start 1903bfb07306007cf7113c51609d1ccf343f3142a879364f18445ab07bcaa199 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=beautiful_jang, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Dec 01 09:13:31 compute-0 podman[78654]: 2025-12-01 09:13:31.897781834 +0000 UTC m=+0.027243060 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:13:31 compute-0 podman[78654]: 2025-12-01 09:13:31.994317269 +0000 UTC m=+0.123778585 container attach 1903bfb07306007cf7113c51609d1ccf343f3142a879364f18445ab07bcaa199 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=beautiful_jang, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:13:32 compute-0 sudo[78699]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yiumetpofjhzhcdzubjhqkkjpdwjzjvp ; /usr/bin/python3'
Dec 01 09:13:32 compute-0 sudo[78699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:13:32 compute-0 python3[78701]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   config set mgr mgr/cephadm/use_repo_digest false
                                           _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:13:32 compute-0 podman[78702]: 2025-12-01 09:13:32.264530447 +0000 UTC m=+0.050630643 container create 99e4a58ec4cb8dce2fdc06c5c649cf56cf6cda25fb2d71c5582049fc704349ea (image=quay.io/ceph/ceph:v18, name=optimistic_cray, ceph=True, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:13:32 compute-0 systemd[1]: Started libpod-conmon-99e4a58ec4cb8dce2fdc06c5c649cf56cf6cda25fb2d71c5582049fc704349ea.scope.
Dec 01 09:13:32 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:13:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d6983d1ee366207ebadf544cff5285b0f5378145fefc093623813b904edb3e89/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d6983d1ee366207ebadf544cff5285b0f5378145fefc093623813b904edb3e89/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:32 compute-0 podman[78702]: 2025-12-01 09:13:32.332572376 +0000 UTC m=+0.118672562 container init 99e4a58ec4cb8dce2fdc06c5c649cf56cf6cda25fb2d71c5582049fc704349ea (image=quay.io/ceph/ceph:v18, name=optimistic_cray, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:13:32 compute-0 podman[78702]: 2025-12-01 09:13:32.245481742 +0000 UTC m=+0.031581948 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:13:32 compute-0 podman[78702]: 2025-12-01 09:13:32.341364297 +0000 UTC m=+0.127464493 container start 99e4a58ec4cb8dce2fdc06c5c649cf56cf6cda25fb2d71c5582049fc704349ea (image=quay.io/ceph/ceph:v18, name=optimistic_cray, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0)
Dec 01 09:13:32 compute-0 podman[78702]: 2025-12-01 09:13:32.344699716 +0000 UTC m=+0.130799912 container attach 99e4a58ec4cb8dce2fdc06c5c649cf56cf6cda25fb2d71c5582049fc704349ea (image=quay.io/ceph/ceph:v18, name=optimistic_cray, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Dec 01 09:13:32 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3188111268' entity='client.admin' 
Dec 01 09:13:32 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config set, name=mgr/cephadm/use_repo_digest}] v 0) v1
Dec 01 09:13:32 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/1972745556' entity='client.admin' 
Dec 01 09:13:32 compute-0 systemd[1]: libpod-99e4a58ec4cb8dce2fdc06c5c649cf56cf6cda25fb2d71c5582049fc704349ea.scope: Deactivated successfully.
Dec 01 09:13:32 compute-0 podman[78702]: 2025-12-01 09:13:32.911040683 +0000 UTC m=+0.697140879 container died 99e4a58ec4cb8dce2fdc06c5c649cf56cf6cda25fb2d71c5582049fc704349ea (image=quay.io/ceph/ceph:v18, name=optimistic_cray, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.39.3)
Dec 01 09:13:32 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e2 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:13:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-d6983d1ee366207ebadf544cff5285b0f5378145fefc093623813b904edb3e89-merged.mount: Deactivated successfully.
Dec 01 09:13:32 compute-0 ceph-mgr[75324]: mgr.server send_report Giving up on OSDs that haven't reported yet, sending potentially incomplete PG state to mon
Dec 01 09:13:32 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v3: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:13:32 compute-0 ceph-mon[75031]: log_channel(cluster) log [WRN] : Health check failed: OSD count 0 < osd_pool_default_size 1 (TOO_FEW_OSDS)
Dec 01 09:13:32 compute-0 podman[78702]: 2025-12-01 09:13:32.956138861 +0000 UTC m=+0.742239057 container remove 99e4a58ec4cb8dce2fdc06c5c649cf56cf6cda25fb2d71c5582049fc704349ea (image=quay.io/ceph/ceph:v18, name=optimistic_cray, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250507)
Dec 01 09:13:32 compute-0 systemd[1]: libpod-conmon-99e4a58ec4cb8dce2fdc06c5c649cf56cf6cda25fb2d71c5582049fc704349ea.scope: Deactivated successfully.
Dec 01 09:13:32 compute-0 sudo[78699]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:33 compute-0 beautiful_jang[78671]: [
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:     {
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:         "available": false,
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:         "ceph_device": false,
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:         "device_id": "QEMU_DVD-ROM_QM00001",
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:         "lsm_data": {},
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:         "lvs": [],
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:         "path": "/dev/sr0",
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:         "rejected_reasons": [
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:             "Has a FileSystem",
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:             "Insufficient space (<5GB)"
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:         ],
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:         "sys_api": {
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:             "actuators": null,
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:             "device_nodes": "sr0",
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:             "devname": "sr0",
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:             "human_readable_size": "482.00 KB",
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:             "id_bus": "ata",
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:             "model": "QEMU DVD-ROM",
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:             "nr_requests": "2",
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:             "parent": "/dev/sr0",
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:             "partitions": {},
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:             "path": "/dev/sr0",
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:             "removable": "1",
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:             "rev": "2.5+",
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:             "ro": "0",
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:             "rotational": "1",
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:             "sas_address": "",
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:             "sas_device_handle": "",
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:             "scheduler_mode": "mq-deadline",
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:             "sectors": 0,
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:             "sectorsize": "2048",
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:             "size": 493568.0,
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:             "support_discard": "2048",
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:             "type": "disk",
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:             "vendor": "QEMU"
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:         }
Dec 01 09:13:33 compute-0 beautiful_jang[78671]:     }
Dec 01 09:13:33 compute-0 beautiful_jang[78671]: ]
Dec 01 09:13:33 compute-0 systemd[1]: libpod-1903bfb07306007cf7113c51609d1ccf343f3142a879364f18445ab07bcaa199.scope: Deactivated successfully.
Dec 01 09:13:33 compute-0 podman[78654]: 2025-12-01 09:13:33.345518536 +0000 UTC m=+1.474979772 container died 1903bfb07306007cf7113c51609d1ccf343f3142a879364f18445ab07bcaa199 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=beautiful_jang, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:13:33 compute-0 systemd[1]: libpod-1903bfb07306007cf7113c51609d1ccf343f3142a879364f18445ab07bcaa199.scope: Consumed 1.382s CPU time.
Dec 01 09:13:33 compute-0 systemd[1]: var-lib-containers-storage-overlay-10d2f33fd1b0c2fe7fb20b41c94ea34e3bd0c096ee73f6aeadddfa08650a78c1-merged.mount: Deactivated successfully.
Dec 01 09:13:33 compute-0 podman[78654]: 2025-12-01 09:13:33.400484967 +0000 UTC m=+1.529946183 container remove 1903bfb07306007cf7113c51609d1ccf343f3142a879364f18445ab07bcaa199 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=beautiful_jang, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:13:33 compute-0 systemd[1]: libpod-conmon-1903bfb07306007cf7113c51609d1ccf343f3142a879364f18445ab07bcaa199.scope: Deactivated successfully.
Dec 01 09:13:33 compute-0 sudo[78368]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:33 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:13:33 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:33 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:13:33 compute-0 sudo[80456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wiwnptdxxgtvvlhzhtdskmdmjbmwpmsg ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764580413.3065505-36368-230320539451562/async_wrapper.py j466498949925 30 /home/zuul/.ansible/tmp/ansible-tmp-1764580413.3065505-36368-230320539451562/AnsiballZ_command.py _'
Dec 01 09:13:33 compute-0 sudo[80456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:13:33 compute-0 ansible-async_wrapper.py[80458]: Invoked with j466498949925 30 /home/zuul/.ansible/tmp/ansible-tmp-1764580413.3065505-36368-230320539451562/AnsiballZ_command.py _
Dec 01 09:13:33 compute-0 ansible-async_wrapper.py[80461]: Starting module and watcher
Dec 01 09:13:33 compute-0 ansible-async_wrapper.py[80461]: Start watching 80462 (30)
Dec 01 09:13:33 compute-0 ansible-async_wrapper.py[80462]: Start module (80462)
Dec 01 09:13:33 compute-0 ansible-async_wrapper.py[80458]: Return async_wrapper task started.
Dec 01 09:13:33 compute-0 sudo[80456]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:33 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:34 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:13:34 compute-0 python3[80463]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   orch status --format json _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:13:34 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1972745556' entity='client.admin' 
Dec 01 09:13:34 compute-0 ceph-mon[75031]: Health check failed: OSD count 0 < osd_pool_default_size 1 (TOO_FEW_OSDS)
Dec 01 09:13:34 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:34 compute-0 podman[80464]: 2025-12-01 09:13:34.217122572 +0000 UTC m=+0.106162222 container create 431a01ed72da3ff38b765a0c2fb5cae8f8485e4b52a81f427c550bddafae42a7 (image=quay.io/ceph/ceph:v18, name=friendly_swirles, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True)
Dec 01 09:13:34 compute-0 podman[80464]: 2025-12-01 09:13:34.140827237 +0000 UTC m=+0.029866897 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:13:34 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:34 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:13:34 compute-0 systemd[1]: Started libpod-conmon-431a01ed72da3ff38b765a0c2fb5cae8f8485e4b52a81f427c550bddafae42a7.scope.
Dec 01 09:13:34 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:13:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca6df8fd25eab84d8dbfff5186db2a61eb7ccbfb0ca0820342e9a0a09a305aa2/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca6df8fd25eab84d8dbfff5186db2a61eb7ccbfb0ca0820342e9a0a09a305aa2/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:34 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:34 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"} v 0) v1
Dec 01 09:13:34 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec 01 09:13:34 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:13:34 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:13:34 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec 01 09:13:34 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:13:34 compute-0 ceph-mgr[75324]: [cephadm INFO cephadm.serve] Updating compute-0:/etc/ceph/ceph.conf
Dec 01 09:13:34 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Updating compute-0:/etc/ceph/ceph.conf
Dec 01 09:13:34 compute-0 podman[80464]: 2025-12-01 09:13:34.719911421 +0000 UTC m=+0.608951091 container init 431a01ed72da3ff38b765a0c2fb5cae8f8485e4b52a81f427c550bddafae42a7 (image=quay.io/ceph/ceph:v18, name=friendly_swirles, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:13:34 compute-0 sudo[80482]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:34 compute-0 sudo[80482]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:34 compute-0 podman[80464]: 2025-12-01 09:13:34.73133859 +0000 UTC m=+0.620378240 container start 431a01ed72da3ff38b765a0c2fb5cae8f8485e4b52a81f427c550bddafae42a7 (image=quay.io/ceph/ceph:v18, name=friendly_swirles, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:13:34 compute-0 sudo[80482]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:34 compute-0 podman[80464]: 2025-12-01 09:13:34.735078111 +0000 UTC m=+0.624117811 container attach 431a01ed72da3ff38b765a0c2fb5cae8f8485e4b52a81f427c550bddafae42a7 (image=quay.io/ceph/ceph:v18, name=friendly_swirles, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:13:34 compute-0 sudo[80508]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 01 09:13:34 compute-0 sudo[80508]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:34 compute-0 sudo[80508]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:34 compute-0 sudo[80533]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:34 compute-0 sudo[80533]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:34 compute-0 sudo[80533]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:34 compute-0 sudo[80558]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-5620a9fb-e540-5250-a0e8-7aaad5347e3b/etc/ceph
Dec 01 09:13:34 compute-0 sudo[80558]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:34 compute-0 sudo[80558]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:34 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v4: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:13:34 compute-0 sudo[80583]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:34 compute-0 sudo[80583]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:34 compute-0 sudo[80583]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:35 compute-0 sudo[80613]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-5620a9fb-e540-5250-a0e8-7aaad5347e3b/etc/ceph/ceph.conf.new
Dec 01 09:13:35 compute-0 sudo[80613]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:35 compute-0 sudo[80613]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:35 compute-0 sudo[80674]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:35 compute-0 sudo[80674]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:35 compute-0 sudo[80674]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:35 compute-0 ceph-mon[75031]: pgmap v3: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:13:35 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:35 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:35 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:35 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec 01 09:13:35 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:13:35 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:13:35 compute-0 ceph-mon[75031]: Updating compute-0:/etc/ceph/ceph.conf
Dec 01 09:13:35 compute-0 sudo[80700]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-5620a9fb-e540-5250-a0e8-7aaad5347e3b
Dec 01 09:13:35 compute-0 sudo[80700]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:35 compute-0 sudo[80700]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:35 compute-0 sudo[80725]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:35 compute-0 sudo[80725]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:35 compute-0 sudo[80725]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:35 compute-0 sudo[80778]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kushwdwetjfnjyuxrrqoufhufgezwkhp ; /usr/bin/python3'
Dec 01 09:13:35 compute-0 sudo[80778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:13:35 compute-0 sudo[80771]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-5620a9fb-e540-5250-a0e8-7aaad5347e3b/etc/ceph/ceph.conf.new
Dec 01 09:13:35 compute-0 sudo[80771]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:35 compute-0 sudo[80771]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:35 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14174 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 01 09:13:35 compute-0 friendly_swirles[80479]: 
Dec 01 09:13:35 compute-0 friendly_swirles[80479]: {"available": true, "backend": "cephadm", "paused": false, "workers": 10}
Dec 01 09:13:35 compute-0 systemd[1]: libpod-431a01ed72da3ff38b765a0c2fb5cae8f8485e4b52a81f427c550bddafae42a7.scope: Deactivated successfully.
Dec 01 09:13:35 compute-0 podman[80464]: 2025-12-01 09:13:35.306509589 +0000 UTC m=+1.195549239 container died 431a01ed72da3ff38b765a0c2fb5cae8f8485e4b52a81f427c550bddafae42a7 (image=quay.io/ceph/ceph:v18, name=friendly_swirles, OSD_FLAVOR=default, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef)
Dec 01 09:13:35 compute-0 systemd[1]: var-lib-containers-storage-overlay-ca6df8fd25eab84d8dbfff5186db2a61eb7ccbfb0ca0820342e9a0a09a305aa2-merged.mount: Deactivated successfully.
Dec 01 09:13:35 compute-0 podman[80464]: 2025-12-01 09:13:35.348171615 +0000 UTC m=+1.237211265 container remove 431a01ed72da3ff38b765a0c2fb5cae8f8485e4b52a81f427c550bddafae42a7 (image=quay.io/ceph/ceph:v18, name=friendly_swirles, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default)
Dec 01 09:13:35 compute-0 systemd[1]: libpod-conmon-431a01ed72da3ff38b765a0c2fb5cae8f8485e4b52a81f427c550bddafae42a7.scope: Deactivated successfully.
Dec 01 09:13:35 compute-0 ansible-async_wrapper.py[80462]: Module complete (80462)
Dec 01 09:13:35 compute-0 sudo[80837]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:35 compute-0 sudo[80837]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:35 compute-0 sudo[80837]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:35 compute-0 python3[80794]: ansible-ansible.legacy.async_status Invoked with jid=j466498949925.80458 mode=status _async_dir=/root/.ansible_async
Dec 01 09:13:35 compute-0 sudo[80778]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:35 compute-0 sudo[80862]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-5620a9fb-e540-5250-a0e8-7aaad5347e3b/etc/ceph/ceph.conf.new
Dec 01 09:13:35 compute-0 sudo[80862]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:35 compute-0 sudo[80862]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:35 compute-0 sudo[80899]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:35 compute-0 sudo[80899]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:35 compute-0 sudo[80899]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:35 compute-0 sudo[80965]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mndwadqxmtactheyvnrmkwdsswjontgu ; /usr/bin/python3'
Dec 01 09:13:35 compute-0 sudo[80965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:13:35 compute-0 sudo[80950]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-5620a9fb-e540-5250-a0e8-7aaad5347e3b/etc/ceph/ceph.conf.new
Dec 01 09:13:35 compute-0 sudo[80950]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:35 compute-0 sudo[80950]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:35 compute-0 sudo[80986]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:35 compute-0 sudo[80986]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:35 compute-0 sudo[80986]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:35 compute-0 python3[80983]: ansible-ansible.legacy.async_status Invoked with jid=j466498949925.80458 mode=cleanup _async_dir=/root/.ansible_async
Dec 01 09:13:35 compute-0 sudo[81011]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-5620a9fb-e540-5250-a0e8-7aaad5347e3b/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 01 09:13:35 compute-0 sudo[81011]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:35 compute-0 sudo[81011]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:35 compute-0 sudo[80965]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:35 compute-0 ceph-mgr[75324]: [cephadm INFO cephadm.serve] Updating compute-0:/var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/config/ceph.conf
Dec 01 09:13:35 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Updating compute-0:/var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/config/ceph.conf
Dec 01 09:13:35 compute-0 sudo[81036]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:35 compute-0 sudo[81036]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:35 compute-0 sudo[81036]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:35 compute-0 sudo[81061]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/config
Dec 01 09:13:35 compute-0 sudo[81061]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:35 compute-0 sudo[81061]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:35 compute-0 sudo[81086]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:35 compute-0 sudo[81086]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:35 compute-0 sudo[81086]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:35 compute-0 sudo[81111]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-5620a9fb-e540-5250-a0e8-7aaad5347e3b/var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/config
Dec 01 09:13:35 compute-0 sudo[81111]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:35 compute-0 sudo[81111]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:36 compute-0 sudo[81139]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:36 compute-0 sudo[81180]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvrimsatoiaxooyaivhibxiccxbdsfin ; /usr/bin/python3'
Dec 01 09:13:36 compute-0 sudo[81180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:13:36 compute-0 sudo[81139]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:36 compute-0 sudo[81139]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:36 compute-0 sudo[81187]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-5620a9fb-e540-5250-a0e8-7aaad5347e3b/var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/config/ceph.conf.new
Dec 01 09:13:36 compute-0 sudo[81187]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:36 compute-0 sudo[81187]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:36 compute-0 sudo[81212]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:36 compute-0 sudo[81212]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:36 compute-0 sudo[81212]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:36 compute-0 python3[81185]: ansible-ansible.builtin.stat Invoked with path=/home/ceph-admin/specs/ceph_spec.yaml follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 01 09:13:36 compute-0 ceph-mon[75031]: pgmap v4: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:13:36 compute-0 ceph-mon[75031]: from='client.14174 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 01 09:13:36 compute-0 ceph-mon[75031]: Updating compute-0:/var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/config/ceph.conf
Dec 01 09:13:36 compute-0 sudo[81180]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:36 compute-0 sudo[81237]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-5620a9fb-e540-5250-a0e8-7aaad5347e3b
Dec 01 09:13:36 compute-0 sudo[81237]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:36 compute-0 sudo[81237]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:36 compute-0 sudo[81264]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:36 compute-0 sudo[81264]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:36 compute-0 sudo[81264]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:36 compute-0 sudo[81289]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-5620a9fb-e540-5250-a0e8-7aaad5347e3b/var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/config/ceph.conf.new
Dec 01 09:13:36 compute-0 sudo[81289]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:36 compute-0 sudo[81289]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:36 compute-0 sudo[81337]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:36 compute-0 sudo[81337]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:36 compute-0 sudo[81337]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:36 compute-0 sudo[81394]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfgsvmzjogyiiqqndcmmdmanxnlixrzj ; /usr/bin/python3'
Dec 01 09:13:36 compute-0 sudo[81394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:13:36 compute-0 sudo[81374]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-5620a9fb-e540-5250-a0e8-7aaad5347e3b/var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/config/ceph.conf.new
Dec 01 09:13:36 compute-0 sudo[81374]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:36 compute-0 sudo[81374]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:36 compute-0 sudo[81413]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:36 compute-0 sudo[81413]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:36 compute-0 sudo[81413]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:36 compute-0 python3[81410]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z   --volume /home/ceph-admin/specs/ceph_spec.yaml:/home/ceph_spec.yaml:z   --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   orch status --format json _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:13:36 compute-0 sudo[81438]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-5620a9fb-e540-5250-a0e8-7aaad5347e3b/var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/config/ceph.conf.new
Dec 01 09:13:36 compute-0 sudo[81438]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:36 compute-0 sudo[81438]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:36 compute-0 podman[81455]: 2025-12-01 09:13:36.750558682 +0000 UTC m=+0.081219681 container create b057465fd318b1a0308895210a8a44417e7844d697a76f48e7985b2dfb23391e (image=quay.io/ceph/ceph:v18, name=recursing_albattani, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Dec 01 09:13:36 compute-0 sudo[81474]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:36 compute-0 sudo[81474]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:36 compute-0 sudo[81474]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:36 compute-0 podman[81455]: 2025-12-01 09:13:36.696881309 +0000 UTC m=+0.027542328 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:13:36 compute-0 systemd[1]: Started libpod-conmon-b057465fd318b1a0308895210a8a44417e7844d697a76f48e7985b2dfb23391e.scope.
Dec 01 09:13:36 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:13:36 compute-0 sudo[81500]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-5620a9fb-e540-5250-a0e8-7aaad5347e3b/var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/config/ceph.conf.new /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/config/ceph.conf
Dec 01 09:13:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/29c84533923f6c1f5f9c8b792aced0a8831f0c78cc76d6133f2fd02c9d28aa8a/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/29c84533923f6c1f5f9c8b792aced0a8831f0c78cc76d6133f2fd02c9d28aa8a/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/29c84533923f6c1f5f9c8b792aced0a8831f0c78cc76d6133f2fd02c9d28aa8a/merged/home/ceph_spec.yaml supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:36 compute-0 sudo[81500]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:36 compute-0 sudo[81500]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:36 compute-0 ceph-mgr[75324]: [cephadm INFO cephadm.serve] Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Dec 01 09:13:36 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Dec 01 09:13:36 compute-0 podman[81455]: 2025-12-01 09:13:36.887437554 +0000 UTC m=+0.218098573 container init b057465fd318b1a0308895210a8a44417e7844d697a76f48e7985b2dfb23391e (image=quay.io/ceph/ceph:v18, name=recursing_albattani, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:13:36 compute-0 podman[81455]: 2025-12-01 09:13:36.893675869 +0000 UTC m=+0.224336868 container start b057465fd318b1a0308895210a8a44417e7844d697a76f48e7985b2dfb23391e (image=quay.io/ceph/ceph:v18, name=recursing_albattani, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0)
Dec 01 09:13:36 compute-0 podman[81455]: 2025-12-01 09:13:36.900634626 +0000 UTC m=+0.231295625 container attach b057465fd318b1a0308895210a8a44417e7844d697a76f48e7985b2dfb23391e (image=quay.io/ceph/ceph:v18, name=recursing_albattani, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec 01 09:13:36 compute-0 sudo[81532]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:36 compute-0 sudo[81532]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:36 compute-0 sudo[81532]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:36 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v5: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:13:36 compute-0 sudo[81558]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 01 09:13:36 compute-0 sudo[81558]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:36 compute-0 sudo[81558]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:37 compute-0 sudo[81583]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:37 compute-0 sudo[81583]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:37 compute-0 sudo[81583]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:37 compute-0 sudo[81608]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-5620a9fb-e540-5250-a0e8-7aaad5347e3b/etc/ceph
Dec 01 09:13:37 compute-0 sudo[81608]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:37 compute-0 sudo[81608]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:37 compute-0 ceph-mon[75031]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Dec 01 09:13:37 compute-0 sudo[81633]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:37 compute-0 sudo[81633]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:37 compute-0 sudo[81633]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:37 compute-0 sudo[81658]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-5620a9fb-e540-5250-a0e8-7aaad5347e3b/etc/ceph/ceph.client.admin.keyring.new
Dec 01 09:13:37 compute-0 sudo[81658]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:37 compute-0 sudo[81658]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:37 compute-0 sudo[81693]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:37 compute-0 sudo[81693]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:37 compute-0 sudo[81693]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:37 compute-0 sudo[81727]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-5620a9fb-e540-5250-a0e8-7aaad5347e3b
Dec 01 09:13:37 compute-0 sudo[81727]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:37 compute-0 sudo[81727]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:37 compute-0 sudo[81752]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:37 compute-0 sudo[81752]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:37 compute-0 sudo[81752]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:37 compute-0 sudo[81777]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-5620a9fb-e540-5250-a0e8-7aaad5347e3b/etc/ceph/ceph.client.admin.keyring.new
Dec 01 09:13:37 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14176 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 01 09:13:37 compute-0 recursing_albattani[81527]: 
Dec 01 09:13:37 compute-0 recursing_albattani[81527]: {"available": true, "backend": "cephadm", "paused": false, "workers": 10}
Dec 01 09:13:37 compute-0 sudo[81777]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:37 compute-0 sudo[81777]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:37 compute-0 systemd[1]: libpod-b057465fd318b1a0308895210a8a44417e7844d697a76f48e7985b2dfb23391e.scope: Deactivated successfully.
Dec 01 09:13:37 compute-0 podman[81455]: 2025-12-01 09:13:37.512111192 +0000 UTC m=+0.842772191 container died b057465fd318b1a0308895210a8a44417e7844d697a76f48e7985b2dfb23391e (image=quay.io/ceph/ceph:v18, name=recursing_albattani, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Dec 01 09:13:37 compute-0 systemd[1]: var-lib-containers-storage-overlay-29c84533923f6c1f5f9c8b792aced0a8831f0c78cc76d6133f2fd02c9d28aa8a-merged.mount: Deactivated successfully.
Dec 01 09:13:37 compute-0 podman[81455]: 2025-12-01 09:13:37.617830059 +0000 UTC m=+0.948491058 container remove b057465fd318b1a0308895210a8a44417e7844d697a76f48e7985b2dfb23391e (image=quay.io/ceph/ceph:v18, name=recursing_albattani, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:13:37 compute-0 systemd[1]: libpod-conmon-b057465fd318b1a0308895210a8a44417e7844d697a76f48e7985b2dfb23391e.scope: Deactivated successfully.
Dec 01 09:13:37 compute-0 sudo[81838]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:37 compute-0 sudo[81838]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:37 compute-0 sudo[81838]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:37 compute-0 sudo[81394]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:37 compute-0 sudo[81866]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-5620a9fb-e540-5250-a0e8-7aaad5347e3b/etc/ceph/ceph.client.admin.keyring.new
Dec 01 09:13:37 compute-0 sudo[81866]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:37 compute-0 sudo[81866]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:37 compute-0 sudo[81891]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:37 compute-0 sudo[81891]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:37 compute-0 sudo[81891]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:37 compute-0 sudo[81916]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-5620a9fb-e540-5250-a0e8-7aaad5347e3b/etc/ceph/ceph.client.admin.keyring.new
Dec 01 09:13:37 compute-0 sudo[81916]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:37 compute-0 sudo[81916]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:37 compute-0 sudo[81941]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:37 compute-0 sudo[81941]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:37 compute-0 sudo[81941]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:37 compute-0 sudo[81990]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqwcouffhhrqczyqroaqiaoupxaxqhht ; /usr/bin/python3'
Dec 01 09:13:37 compute-0 sudo[81990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:13:37 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e2 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:13:37 compute-0 sudo[81989]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-5620a9fb-e540-5250-a0e8-7aaad5347e3b/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Dec 01 09:13:37 compute-0 sudo[81989]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:37 compute-0 sudo[81989]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:37 compute-0 ceph-mgr[75324]: [cephadm INFO cephadm.serve] Updating compute-0:/var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/config/ceph.client.admin.keyring
Dec 01 09:13:37 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Updating compute-0:/var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/config/ceph.client.admin.keyring
Dec 01 09:13:38 compute-0 sudo[82017]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:38 compute-0 sudo[82017]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:38 compute-0 sudo[82017]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:38 compute-0 python3[82005]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z   --volume /home/ceph-admin/specs/ceph_spec.yaml:/home/ceph_spec.yaml:z   --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   config set global log_to_file true _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:13:38 compute-0 sudo[82042]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/config
Dec 01 09:13:38 compute-0 sudo[82042]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:38 compute-0 sudo[82042]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:38 compute-0 podman[82055]: 2025-12-01 09:13:38.107100968 +0000 UTC m=+0.040876014 container create dbb1ff21f533cdecdb1940d85c713370b32ba6e2e15dc024e260f56ee960fea3 (image=quay.io/ceph/ceph:v18, name=upbeat_kepler, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:13:38 compute-0 systemd[1]: Started libpod-conmon-dbb1ff21f533cdecdb1940d85c713370b32ba6e2e15dc024e260f56ee960fea3.scope.
Dec 01 09:13:38 compute-0 sudo[82080]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:38 compute-0 sudo[82080]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:38 compute-0 sudo[82080]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:38 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:13:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb02d7ccbdb527d7a6b098b61835ba4e3f9adfa4f9f7758f30a636b30e3db9f9/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb02d7ccbdb527d7a6b098b61835ba4e3f9adfa4f9f7758f30a636b30e3db9f9/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb02d7ccbdb527d7a6b098b61835ba4e3f9adfa4f9f7758f30a636b30e3db9f9/merged/home/ceph_spec.yaml supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:38 compute-0 podman[82055]: 2025-12-01 09:13:38.089970659 +0000 UTC m=+0.023745725 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:13:38 compute-0 podman[82055]: 2025-12-01 09:13:38.212592108 +0000 UTC m=+0.146367174 container init dbb1ff21f533cdecdb1940d85c713370b32ba6e2e15dc024e260f56ee960fea3 (image=quay.io/ceph/ceph:v18, name=upbeat_kepler, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:13:38 compute-0 podman[82055]: 2025-12-01 09:13:38.222182613 +0000 UTC m=+0.155957659 container start dbb1ff21f533cdecdb1940d85c713370b32ba6e2e15dc024e260f56ee960fea3 (image=quay.io/ceph/ceph:v18, name=upbeat_kepler, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.39.3)
Dec 01 09:13:38 compute-0 sudo[82110]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-5620a9fb-e540-5250-a0e8-7aaad5347e3b/var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/config
Dec 01 09:13:38 compute-0 sudo[82110]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:38 compute-0 sudo[82110]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:38 compute-0 ceph-mon[75031]: pgmap v5: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:13:38 compute-0 ceph-mon[75031]: from='client.14176 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 01 09:13:38 compute-0 podman[82055]: 2025-12-01 09:13:38.264857319 +0000 UTC m=+0.198632385 container attach dbb1ff21f533cdecdb1940d85c713370b32ba6e2e15dc024e260f56ee960fea3 (image=quay.io/ceph/ceph:v18, name=upbeat_kepler, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:13:38 compute-0 sudo[82136]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:38 compute-0 sudo[82136]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:38 compute-0 sudo[82136]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:38 compute-0 sudo[82161]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-5620a9fb-e540-5250-a0e8-7aaad5347e3b/var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/config/ceph.client.admin.keyring.new
Dec 01 09:13:38 compute-0 sudo[82161]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:38 compute-0 sudo[82161]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:38 compute-0 sudo[82186]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:38 compute-0 sudo[82186]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:38 compute-0 sudo[82186]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:38 compute-0 sudo[82211]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-5620a9fb-e540-5250-a0e8-7aaad5347e3b
Dec 01 09:13:38 compute-0 sudo[82211]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:38 compute-0 sudo[82211]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:38 compute-0 sudo[82236]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:38 compute-0 sudo[82236]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:38 compute-0 sudo[82236]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:38 compute-0 sudo[82262]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-5620a9fb-e540-5250-a0e8-7aaad5347e3b/var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/config/ceph.client.admin.keyring.new
Dec 01 09:13:38 compute-0 sudo[82262]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:38 compute-0 sudo[82262]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:38 compute-0 sudo[82328]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:38 compute-0 sudo[82328]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:38 compute-0 sudo[82328]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:38 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config set, name=log_to_file}] v 0) v1
Dec 01 09:13:38 compute-0 sudo[82353]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-5620a9fb-e540-5250-a0e8-7aaad5347e3b/var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/config/ceph.client.admin.keyring.new
Dec 01 09:13:38 compute-0 sudo[82353]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:38 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/2416352962' entity='client.admin' 
Dec 01 09:13:38 compute-0 sudo[82353]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:38 compute-0 systemd[1]: libpod-dbb1ff21f533cdecdb1940d85c713370b32ba6e2e15dc024e260f56ee960fea3.scope: Deactivated successfully.
Dec 01 09:13:38 compute-0 podman[82055]: 2025-12-01 09:13:38.836939746 +0000 UTC m=+0.770714792 container died dbb1ff21f533cdecdb1940d85c713370b32ba6e2e15dc024e260f56ee960fea3 (image=quay.io/ceph/ceph:v18, name=upbeat_kepler, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:13:38 compute-0 systemd[1]: var-lib-containers-storage-overlay-cb02d7ccbdb527d7a6b098b61835ba4e3f9adfa4f9f7758f30a636b30e3db9f9-merged.mount: Deactivated successfully.
Dec 01 09:13:38 compute-0 sudo[82380]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:38 compute-0 podman[82055]: 2025-12-01 09:13:38.893113063 +0000 UTC m=+0.826888109 container remove dbb1ff21f533cdecdb1940d85c713370b32ba6e2e15dc024e260f56ee960fea3 (image=quay.io/ceph/ceph:v18, name=upbeat_kepler, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:13:38 compute-0 sudo[82380]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:38 compute-0 sudo[82380]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:38 compute-0 systemd[1]: libpod-conmon-dbb1ff21f533cdecdb1940d85c713370b32ba6e2e15dc024e260f56ee960fea3.scope: Deactivated successfully.
Dec 01 09:13:38 compute-0 sudo[81990]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:38 compute-0 ansible-async_wrapper.py[80461]: Done in kid B.
Dec 01 09:13:38 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v6: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:13:38 compute-0 sudo[82419]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-5620a9fb-e540-5250-a0e8-7aaad5347e3b/var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/config/ceph.client.admin.keyring.new
Dec 01 09:13:38 compute-0 sudo[82419]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:38 compute-0 sudo[82419]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:39 compute-0 sudo[82444]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:39 compute-0 sudo[82444]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:39 compute-0 sudo[82444]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:39 compute-0 sudo[82491]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-johevdhxpttiseeoazlnjzjswmqoexox ; /usr/bin/python3'
Dec 01 09:13:39 compute-0 sudo[82491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:13:39 compute-0 sudo[82493]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-5620a9fb-e540-5250-a0e8-7aaad5347e3b/var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/config/ceph.client.admin.keyring.new /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/config/ceph.client.admin.keyring
Dec 01 09:13:39 compute-0 sudo[82493]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:39 compute-0 sudo[82493]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:39 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:13:39 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:39 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:13:39 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:39 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec 01 09:13:39 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:39 compute-0 ceph-mgr[75324]: [progress INFO root] update: starting ev c5abd3f8-9653-4f8e-81e4-4aa8afc043ca (Updating crash deployment (+1 -> 1))
Dec 01 09:13:39 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) v1
Dec 01 09:13:39 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Dec 01 09:13:39 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Dec 01 09:13:39 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:13:39 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:13:39 compute-0 ceph-mgr[75324]: [cephadm INFO cephadm.serve] Deploying daemon crash.compute-0 on compute-0
Dec 01 09:13:39 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Deploying daemon crash.compute-0 on compute-0
Dec 01 09:13:39 compute-0 sudo[82520]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:39 compute-0 sudo[82520]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:39 compute-0 sudo[82520]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:39 compute-0 python3[82498]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z   --volume /home/ceph-admin/specs/ceph_spec.yaml:/home/ceph_spec.yaml:z   --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   config set global mon_cluster_log_to_file true _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:13:39 compute-0 sudo[82545]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:13:39 compute-0 sudo[82545]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:39 compute-0 sudo[82545]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:39 compute-0 podman[82548]: 2025-12-01 09:13:39.276121239 +0000 UTC m=+0.051305634 container create 58ebde7ce75b37d643c4318773c3a5fd5ef3750e556d1aca35b38312ce0ce189 (image=quay.io/ceph/ceph:v18, name=sleepy_liskov, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, io.buildah.version=1.39.3)
Dec 01 09:13:39 compute-0 systemd[1]: Started libpod-conmon-58ebde7ce75b37d643c4318773c3a5fd5ef3750e556d1aca35b38312ce0ce189.scope.
Dec 01 09:13:39 compute-0 podman[82548]: 2025-12-01 09:13:39.251714505 +0000 UTC m=+0.026898900 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:13:39 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:13:39 compute-0 sudo[82583]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e53a8e5acef89a6971d5bb616e16281af2a9a02b114718d991e587846bcd9ea/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e53a8e5acef89a6971d5bb616e16281af2a9a02b114718d991e587846bcd9ea/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e53a8e5acef89a6971d5bb616e16281af2a9a02b114718d991e587846bcd9ea/merged/home/ceph_spec.yaml supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:39 compute-0 sudo[82583]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:39 compute-0 sudo[82583]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:39 compute-0 podman[82548]: 2025-12-01 09:13:39.37017936 +0000 UTC m=+0.145363765 container init 58ebde7ce75b37d643c4318773c3a5fd5ef3750e556d1aca35b38312ce0ce189 (image=quay.io/ceph/ceph:v18, name=sleepy_liskov, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2)
Dec 01 09:13:39 compute-0 podman[82548]: 2025-12-01 09:13:39.381728103 +0000 UTC m=+0.156912488 container start 58ebde7ce75b37d643c4318773c3a5fd5ef3750e556d1aca35b38312ce0ce189 (image=quay.io/ceph/ceph:v18, name=sleepy_liskov, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=reef, org.label-schema.schema-version=1.0)
Dec 01 09:13:39 compute-0 podman[82548]: 2025-12-01 09:13:39.389514764 +0000 UTC m=+0.164699249 container attach 58ebde7ce75b37d643c4318773c3a5fd5ef3750e556d1aca35b38312ce0ce189 (image=quay.io/ceph/ceph:v18, name=sleepy_liskov, CEPH_REF=reef, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS)
Dec 01 09:13:39 compute-0 sudo[82613]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 _orch deploy --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b
Dec 01 09:13:39 compute-0 sudo[82613]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:39 compute-0 podman[82691]: 2025-12-01 09:13:39.813782674 +0000 UTC m=+0.047879651 container create 863c98d2158f18e2517903963c1b438f267f43a4780521fef6c70d5d6fecf2ed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=objective_beaver, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507)
Dec 01 09:13:39 compute-0 ceph-mon[75031]: Updating compute-0:/var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/config/ceph.client.admin.keyring
Dec 01 09:13:39 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2416352962' entity='client.admin' 
Dec 01 09:13:39 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:39 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:39 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:39 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Dec 01 09:13:39 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Dec 01 09:13:39 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:13:39 compute-0 systemd[1]: Started libpod-conmon-863c98d2158f18e2517903963c1b438f267f43a4780521fef6c70d5d6fecf2ed.scope.
Dec 01 09:13:39 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:13:39 compute-0 podman[82691]: 2025-12-01 09:13:39.880810754 +0000 UTC m=+0.114907761 container init 863c98d2158f18e2517903963c1b438f267f43a4780521fef6c70d5d6fecf2ed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=objective_beaver, CEPH_REF=reef, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:13:39 compute-0 podman[82691]: 2025-12-01 09:13:39.885483892 +0000 UTC m=+0.119580869 container start 863c98d2158f18e2517903963c1b438f267f43a4780521fef6c70d5d6fecf2ed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=objective_beaver, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Dec 01 09:13:39 compute-0 objective_beaver[82715]: 167 167
Dec 01 09:13:39 compute-0 podman[82691]: 2025-12-01 09:13:39.794251915 +0000 UTC m=+0.028348912 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:13:39 compute-0 systemd[1]: libpod-863c98d2158f18e2517903963c1b438f267f43a4780521fef6c70d5d6fecf2ed.scope: Deactivated successfully.
Dec 01 09:13:39 compute-0 podman[82691]: 2025-12-01 09:13:39.903403054 +0000 UTC m=+0.137500031 container attach 863c98d2158f18e2517903963c1b438f267f43a4780521fef6c70d5d6fecf2ed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=objective_beaver, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Dec 01 09:13:39 compute-0 podman[82691]: 2025-12-01 09:13:39.903854937 +0000 UTC m=+0.137951914 container died 863c98d2158f18e2517903963c1b438f267f43a4780521fef6c70d5d6fecf2ed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=objective_beaver, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:13:39 compute-0 systemd[1]: var-lib-containers-storage-overlay-f97932927680a395270ee6f38245217f3c971a29c5dea2216bcb4485ca36a553-merged.mount: Deactivated successfully.
Dec 01 09:13:39 compute-0 podman[82691]: 2025-12-01 09:13:39.943601867 +0000 UTC m=+0.177698834 container remove 863c98d2158f18e2517903963c1b438f267f43a4780521fef6c70d5d6fecf2ed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=objective_beaver, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3)
Dec 01 09:13:39 compute-0 systemd[1]: libpod-conmon-863c98d2158f18e2517903963c1b438f267f43a4780521fef6c70d5d6fecf2ed.scope: Deactivated successfully.
Dec 01 09:13:39 compute-0 systemd[1]: Reloading.
Dec 01 09:13:39 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config set, name=mon_cluster_log_to_file}] v 0) v1
Dec 01 09:13:40 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/592707641' entity='client.admin' 
Dec 01 09:13:40 compute-0 podman[82548]: 2025-12-01 09:13:40.03872159 +0000 UTC m=+0.813905975 container died 58ebde7ce75b37d643c4318773c3a5fd5ef3750e556d1aca35b38312ce0ce189 (image=quay.io/ceph/ceph:v18, name=sleepy_liskov, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:13:40 compute-0 systemd-sysv-generator[82780]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:13:40 compute-0 systemd-rc-local-generator[82776]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:13:40 compute-0 systemd[1]: libpod-58ebde7ce75b37d643c4318773c3a5fd5ef3750e556d1aca35b38312ce0ce189.scope: Deactivated successfully.
Dec 01 09:13:40 compute-0 systemd[1]: var-lib-containers-storage-overlay-7e53a8e5acef89a6971d5bb616e16281af2a9a02b114718d991e587846bcd9ea-merged.mount: Deactivated successfully.
Dec 01 09:13:40 compute-0 podman[82548]: 2025-12-01 09:13:40.289646766 +0000 UTC m=+1.064831151 container remove 58ebde7ce75b37d643c4318773c3a5fd5ef3750e556d1aca35b38312ce0ce189 (image=quay.io/ceph/ceph:v18, name=sleepy_liskov, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:13:40 compute-0 systemd[1]: libpod-conmon-58ebde7ce75b37d643c4318773c3a5fd5ef3750e556d1aca35b38312ce0ce189.scope: Deactivated successfully.
Dec 01 09:13:40 compute-0 sudo[82491]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:40 compute-0 systemd[1]: Reloading.
Dec 01 09:13:40 compute-0 systemd-sysv-generator[82819]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:13:40 compute-0 systemd-rc-local-generator[82814]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:13:40 compute-0 sudo[82848]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yodaxgvoqdtohrbscfaqthqfluiordjc ; /usr/bin/python3'
Dec 01 09:13:40 compute-0 sudo[82848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:13:40 compute-0 systemd[1]: Starting Ceph crash.compute-0 for 5620a9fb-e540-5250-a0e8-7aaad5347e3b...
Dec 01 09:13:40 compute-0 python3[82852]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z   --volume /home/ceph-admin/specs/ceph_spec.yaml:/home/ceph_spec.yaml:z   --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd set-require-min-compat-client mimic _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:13:40 compute-0 podman[82878]: 2025-12-01 09:13:40.786096649 +0000 UTC m=+0.052315434 container create 3780f2718563d335333a752b335d6890d0dadc2960719618607e0f21615a9a50 (image=quay.io/ceph/ceph:v18, name=gallant_bassi, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Dec 01 09:13:40 compute-0 systemd[1]: Started libpod-conmon-3780f2718563d335333a752b335d6890d0dadc2960719618607e0f21615a9a50.scope.
Dec 01 09:13:40 compute-0 ceph-mon[75031]: pgmap v6: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:13:40 compute-0 ceph-mon[75031]: Deploying daemon crash.compute-0 on compute-0
Dec 01 09:13:40 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/592707641' entity='client.admin' 
Dec 01 09:13:40 compute-0 podman[82878]: 2025-12-01 09:13:40.763542899 +0000 UTC m=+0.029761704 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:13:40 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:13:40 compute-0 podman[82915]: 2025-12-01 09:13:40.864702491 +0000 UTC m=+0.054522179 container create 83d60e6b432ce4cbd9a76d9ee4c24e49cbc1130ab3ceccbddaa8851b48170ec1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-crash-compute-0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0)
Dec 01 09:13:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c948808cf65ac18d9238ee6ad166ab88d63fb9149c76040ce59b26af013995e/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c948808cf65ac18d9238ee6ad166ab88d63fb9149c76040ce59b26af013995e/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c948808cf65ac18d9238ee6ad166ab88d63fb9149c76040ce59b26af013995e/merged/home/ceph_spec.yaml supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:40 compute-0 podman[82878]: 2025-12-01 09:13:40.885643503 +0000 UTC m=+0.151862308 container init 3780f2718563d335333a752b335d6890d0dadc2960719618607e0f21615a9a50 (image=quay.io/ceph/ceph:v18, name=gallant_bassi, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Dec 01 09:13:40 compute-0 podman[82878]: 2025-12-01 09:13:40.895018711 +0000 UTC m=+0.161237496 container start 3780f2718563d335333a752b335d6890d0dadc2960719618607e0f21615a9a50 (image=quay.io/ceph/ceph:v18, name=gallant_bassi, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:13:40 compute-0 podman[82878]: 2025-12-01 09:13:40.899151054 +0000 UTC m=+0.165370129 container attach 3780f2718563d335333a752b335d6890d0dadc2960719618607e0f21615a9a50 (image=quay.io/ceph/ceph:v18, name=gallant_bassi, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec 01 09:13:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ac2cec797eb7bef878699279aa8ee94afbcebe6e9e58528317e8ac321c09be38/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ac2cec797eb7bef878699279aa8ee94afbcebe6e9e58528317e8ac321c09be38/merged/etc/ceph/ceph.client.crash.compute-0.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ac2cec797eb7bef878699279aa8ee94afbcebe6e9e58528317e8ac321c09be38/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ac2cec797eb7bef878699279aa8ee94afbcebe6e9e58528317e8ac321c09be38/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:40 compute-0 podman[82915]: 2025-12-01 09:13:40.928280258 +0000 UTC m=+0.118099966 container init 83d60e6b432ce4cbd9a76d9ee4c24e49cbc1130ab3ceccbddaa8851b48170ec1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-crash-compute-0, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:13:40 compute-0 podman[82915]: 2025-12-01 09:13:40.838438822 +0000 UTC m=+0.028258560 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:13:40 compute-0 podman[82915]: 2025-12-01 09:13:40.935115651 +0000 UTC m=+0.124935339 container start 83d60e6b432ce4cbd9a76d9ee4c24e49cbc1130ab3ceccbddaa8851b48170ec1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-crash-compute-0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.39.3, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:13:40 compute-0 bash[82915]: 83d60e6b432ce4cbd9a76d9ee4c24e49cbc1130ab3ceccbddaa8851b48170ec1
Dec 01 09:13:40 compute-0 systemd[1]: Started Ceph crash.compute-0 for 5620a9fb-e540-5250-a0e8-7aaad5347e3b.
Dec 01 09:13:40 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v7: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:13:40 compute-0 sudo[82613]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:13:40 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:41 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:13:41 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:41 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.crash}] v 0) v1
Dec 01 09:13:41 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:41 compute-0 ceph-mgr[75324]: [progress INFO root] complete: finished ev c5abd3f8-9653-4f8e-81e4-4aa8afc043ca (Updating crash deployment (+1 -> 1))
Dec 01 09:13:41 compute-0 ceph-mgr[75324]: [progress INFO root] Completed event c5abd3f8-9653-4f8e-81e4-4aa8afc043ca (Updating crash deployment (+1 -> 1)) in 2 seconds
Dec 01 09:13:41 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.crash}] v 0) v1
Dec 01 09:13:41 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:41 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev b641bac5-9918-45ef-846f-b436360b0fe4 does not exist
Dec 01 09:13:41 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) v1
Dec 01 09:13:41 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:41 compute-0 ceph-mgr[75324]: [progress INFO root] update: starting ev 0ad1d74c-45e7-464b-841d-9ea23a988291 (Updating mgr deployment (+1 -> 2))
Dec 01 09:13:41 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.compute-0.htextg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) v1
Dec 01 09:13:41 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.htextg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Dec 01 09:13:41 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.htextg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Dec 01 09:13:41 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Dec 01 09:13:41 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 01 09:13:41 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:13:41 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:13:41 compute-0 ceph-mgr[75324]: [cephadm INFO cephadm.serve] Deploying daemon mgr.compute-0.htextg on compute-0
Dec 01 09:13:41 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Deploying daemon mgr.compute-0.htextg on compute-0
Dec 01 09:13:41 compute-0 sudo[82941]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:41 compute-0 sudo[82941]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:41 compute-0 sudo[82941]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:41 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-crash-compute-0[82936]: INFO:ceph-crash:pinging cluster to exercise our key
Dec 01 09:13:41 compute-0 sudo[82966]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:13:41 compute-0 sudo[82966]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:41 compute-0 sudo[82966]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:41 compute-0 sudo[82993]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:41 compute-0 sudo[82993]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:41 compute-0 sudo[82993]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:41 compute-0 sudo[83037]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 _orch deploy --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b
Dec 01 09:13:41 compute-0 sudo[83037]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:41 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-crash-compute-0[82936]: 2025-12-01T09:13:41.359+0000 7f28ddf40640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Dec 01 09:13:41 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-crash-compute-0[82936]: 2025-12-01T09:13:41.359+0000 7f28ddf40640 -1 AuthRegistry(0x7f28d8067440) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Dec 01 09:13:41 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-crash-compute-0[82936]: 2025-12-01T09:13:41.360+0000 7f28ddf40640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Dec 01 09:13:41 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-crash-compute-0[82936]: 2025-12-01T09:13:41.360+0000 7f28ddf40640 -1 AuthRegistry(0x7f28ddf3f000) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Dec 01 09:13:41 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-crash-compute-0[82936]: 2025-12-01T09:13:41.362+0000 7f28d77fe640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Dec 01 09:13:41 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-crash-compute-0[82936]: 2025-12-01T09:13:41.362+0000 7f28ddf40640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Dec 01 09:13:41 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-crash-compute-0[82936]: [errno 13] RADOS permission denied (error connecting to the cluster)
Dec 01 09:13:41 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-crash-compute-0[82936]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Dec 01 09:13:41 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd set-require-min-compat-client", "version": "mimic"} v 0) v1
Dec 01 09:13:41 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3480666797' entity='client.admin' cmd=[{"prefix": "osd set-require-min-compat-client", "version": "mimic"}]: dispatch
Dec 01 09:13:41 compute-0 podman[83113]: 2025-12-01 09:13:41.646531552 +0000 UTC m=+0.041915755 container create f5e9eb984a3b04bbab8bd18f273387b83d4f4877ae6199b0a3440405eb3c2d9c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_joliot, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Dec 01 09:13:41 compute-0 systemd[1]: Started libpod-conmon-f5e9eb984a3b04bbab8bd18f273387b83d4f4877ae6199b0a3440405eb3c2d9c.scope.
Dec 01 09:13:41 compute-0 podman[83113]: 2025-12-01 09:13:41.627419465 +0000 UTC m=+0.022803688 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:13:41 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:13:41 compute-0 podman[83113]: 2025-12-01 09:13:41.749957181 +0000 UTC m=+0.145341404 container init f5e9eb984a3b04bbab8bd18f273387b83d4f4877ae6199b0a3440405eb3c2d9c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_joliot, OSD_FLAVOR=default, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:13:41 compute-0 podman[83113]: 2025-12-01 09:13:41.75631872 +0000 UTC m=+0.151702923 container start f5e9eb984a3b04bbab8bd18f273387b83d4f4877ae6199b0a3440405eb3c2d9c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_joliot, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:13:41 compute-0 podman[83113]: 2025-12-01 09:13:41.75935261 +0000 UTC m=+0.154736833 container attach f5e9eb984a3b04bbab8bd18f273387b83d4f4877ae6199b0a3440405eb3c2d9c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_joliot, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:13:41 compute-0 elastic_joliot[83130]: 167 167
Dec 01 09:13:41 compute-0 systemd[1]: libpod-f5e9eb984a3b04bbab8bd18f273387b83d4f4877ae6199b0a3440405eb3c2d9c.scope: Deactivated successfully.
Dec 01 09:13:41 compute-0 podman[83113]: 2025-12-01 09:13:41.763901515 +0000 UTC m=+0.159285738 container died f5e9eb984a3b04bbab8bd18f273387b83d4f4877ae6199b0a3440405eb3c2d9c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_joliot, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec 01 09:13:41 compute-0 systemd[1]: var-lib-containers-storage-overlay-4959ada2c90cbf80f64e33b7ae0a84d1f46a8b734b633881957146213b6c9b8e-merged.mount: Deactivated successfully.
Dec 01 09:13:41 compute-0 podman[83113]: 2025-12-01 09:13:41.838400385 +0000 UTC m=+0.233784588 container remove f5e9eb984a3b04bbab8bd18f273387b83d4f4877ae6199b0a3440405eb3c2d9c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_joliot, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0)
Dec 01 09:13:41 compute-0 systemd[1]: libpod-conmon-f5e9eb984a3b04bbab8bd18f273387b83d4f4877ae6199b0a3440405eb3c2d9c.scope: Deactivated successfully.
Dec 01 09:13:41 compute-0 systemd[1]: Reloading.
Dec 01 09:13:41 compute-0 systemd-rc-local-generator[83170]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:13:41 compute-0 systemd-sysv-generator[83175]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:13:42 compute-0 ceph-mon[75031]: pgmap v7: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:13:42 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:42 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:42 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:42 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:42 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:42 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.htextg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Dec 01 09:13:42 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.htextg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Dec 01 09:13:42 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 01 09:13:42 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:13:42 compute-0 ceph-mon[75031]: Deploying daemon mgr.compute-0.htextg on compute-0
Dec 01 09:13:42 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3480666797' entity='client.admin' cmd=[{"prefix": "osd set-require-min-compat-client", "version": "mimic"}]: dispatch
Dec 01 09:13:42 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e2 do_prune osdmap full prune enabled
Dec 01 09:13:42 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e2 encode_pending skipping prime_pg_temp; mapping job did not start
Dec 01 09:13:42 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3480666797' entity='client.admin' cmd='[{"prefix": "osd set-require-min-compat-client", "version": "mimic"}]': finished
Dec 01 09:13:42 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e3 e3: 0 total, 0 up, 0 in
Dec 01 09:13:42 compute-0 gallant_bassi[82927]: set require_min_compat_client to mimic
Dec 01 09:13:42 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e3: 0 total, 0 up, 0 in
Dec 01 09:13:42 compute-0 podman[82878]: 2025-12-01 09:13:42.065119844 +0000 UTC m=+1.331338649 container died 3780f2718563d335333a752b335d6890d0dadc2960719618607e0f21615a9a50 (image=quay.io/ceph/ceph:v18, name=gallant_bassi, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:13:42 compute-0 systemd[1]: libpod-3780f2718563d335333a752b335d6890d0dadc2960719618607e0f21615a9a50.scope: Deactivated successfully.
Dec 01 09:13:42 compute-0 systemd[1]: var-lib-containers-storage-overlay-6c948808cf65ac18d9238ee6ad166ab88d63fb9149c76040ce59b26af013995e-merged.mount: Deactivated successfully.
Dec 01 09:13:42 compute-0 podman[82878]: 2025-12-01 09:13:42.197804071 +0000 UTC m=+1.464022856 container remove 3780f2718563d335333a752b335d6890d0dadc2960719618607e0f21615a9a50 (image=quay.io/ceph/ceph:v18, name=gallant_bassi, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Dec 01 09:13:42 compute-0 systemd[1]: libpod-conmon-3780f2718563d335333a752b335d6890d0dadc2960719618607e0f21615a9a50.scope: Deactivated successfully.
Dec 01 09:13:42 compute-0 sudo[82848]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:42 compute-0 systemd[1]: Reloading.
Dec 01 09:13:42 compute-0 systemd-rc-local-generator[83230]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:13:42 compute-0 systemd-sysv-generator[83234]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:13:42 compute-0 systemd[1]: Starting Ceph mgr.compute-0.htextg for 5620a9fb-e540-5250-a0e8-7aaad5347e3b...
Dec 01 09:13:42 compute-0 sudo[83298]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mulcpmpbzdxpxrcwbppfdkpytavzvupk ; /usr/bin/python3'
Dec 01 09:13:42 compute-0 sudo[83298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:13:42 compute-0 podman[83314]: 2025-12-01 09:13:42.744423922 +0000 UTC m=+0.051681724 container create 81acc04116fbf4755e71fd111eeaa509f9560ae5f0189ee1e5afafef5403a9ce (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-htextg, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=reef)
Dec 01 09:13:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c2a8cd3227a37de55446263fb0fb96c95365808b997cd221ffa58d6e824aa46/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c2a8cd3227a37de55446263fb0fb96c95365808b997cd221ffa58d6e824aa46/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c2a8cd3227a37de55446263fb0fb96c95365808b997cd221ffa58d6e824aa46/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c2a8cd3227a37de55446263fb0fb96c95365808b997cd221ffa58d6e824aa46/merged/var/lib/ceph/mgr/ceph-compute-0.htextg supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:42 compute-0 podman[83314]: 2025-12-01 09:13:42.814054349 +0000 UTC m=+0.121312171 container init 81acc04116fbf4755e71fd111eeaa509f9560ae5f0189ee1e5afafef5403a9ce (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-htextg, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:13:42 compute-0 python3[83308]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z   --volume /home/ceph-admin/specs/ceph_spec.yaml:/home/ceph_spec.yaml:z   --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   orch apply --in-file /home/ceph_spec.yaml _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:13:42 compute-0 podman[83314]: 2025-12-01 09:13:42.820271063 +0000 UTC m=+0.127528865 container start 81acc04116fbf4755e71fd111eeaa509f9560ae5f0189ee1e5afafef5403a9ce (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-htextg, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:13:42 compute-0 podman[83314]: 2025-12-01 09:13:42.72647413 +0000 UTC m=+0.033731962 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:13:42 compute-0 bash[83314]: 81acc04116fbf4755e71fd111eeaa509f9560ae5f0189ee1e5afafef5403a9ce
Dec 01 09:13:42 compute-0 systemd[1]: Started Ceph mgr.compute-0.htextg for 5620a9fb-e540-5250-a0e8-7aaad5347e3b.
Dec 01 09:13:42 compute-0 ceph-mgr[83335]: set uid:gid to 167:167 (ceph:ceph)
Dec 01 09:13:42 compute-0 ceph-mgr[83335]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mgr, pid 2
Dec 01 09:13:42 compute-0 ceph-mgr[83335]: pidfile_write: ignore empty --pid-file
Dec 01 09:13:42 compute-0 podman[83334]: 2025-12-01 09:13:42.922430985 +0000 UTC m=+0.082055296 container create 0dc64f2d4b4092a1c2cf5ee4c84725c16bd553ea24006da5dd50358a5e69315f (image=quay.io/ceph/ceph:v18, name=recursing_euclid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:13:42 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e3 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:13:42 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v9: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:13:42 compute-0 sudo[83037]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:42 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:13:42 compute-0 podman[83334]: 2025-12-01 09:13:42.871792992 +0000 UTC m=+0.031417333 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:13:42 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:42 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:13:42 compute-0 systemd[1]: Started libpod-conmon-0dc64f2d4b4092a1c2cf5ee4c84725c16bd553ea24006da5dd50358a5e69315f.scope.
Dec 01 09:13:42 compute-0 ceph-mgr[83335]: mgr[py] Loading python module 'alerts'
Dec 01 09:13:43 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:43 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:13:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/781a1fb8e63b0412ffeeb441cad9f2807ea13a9cc0ba03a7d3f142e2768a3e86/merged/home/ceph_spec.yaml supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/781a1fb8e63b0412ffeeb441cad9f2807ea13a9cc0ba03a7d3f142e2768a3e86/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/781a1fb8e63b0412ffeeb441cad9f2807ea13a9cc0ba03a7d3f142e2768a3e86/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:43 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mgr}] v 0) v1
Dec 01 09:13:43 compute-0 ceph-mgr[75324]: [progress INFO root] Writing back 1 completed events
Dec 01 09:13:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:13:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:13:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:13:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:13:43 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) v1
Dec 01 09:13:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:13:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:13:43 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:43 compute-0 podman[83334]: 2025-12-01 09:13:43.054918317 +0000 UTC m=+0.214542678 container init 0dc64f2d4b4092a1c2cf5ee4c84725c16bd553ea24006da5dd50358a5e69315f (image=quay.io/ceph/ceph:v18, name=recursing_euclid, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:13:43 compute-0 ceph-mgr[75324]: [progress INFO root] complete: finished ev 0ad1d74c-45e7-464b-841d-9ea23a988291 (Updating mgr deployment (+1 -> 2))
Dec 01 09:13:43 compute-0 ceph-mgr[75324]: [progress INFO root] Completed event 0ad1d74c-45e7-464b-841d-9ea23a988291 (Updating mgr deployment (+1 -> 2)) in 2 seconds
Dec 01 09:13:43 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mgr}] v 0) v1
Dec 01 09:13:43 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3480666797' entity='client.admin' cmd='[{"prefix": "osd set-require-min-compat-client", "version": "mimic"}]': finished
Dec 01 09:13:43 compute-0 ceph-mon[75031]: osdmap e3: 0 total, 0 up, 0 in
Dec 01 09:13:43 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:43 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:43 compute-0 podman[83334]: 2025-12-01 09:13:43.063961165 +0000 UTC m=+0.223585476 container start 0dc64f2d4b4092a1c2cf5ee4c84725c16bd553ea24006da5dd50358a5e69315f (image=quay.io/ceph/ceph:v18, name=recursing_euclid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Dec 01 09:13:43 compute-0 podman[83334]: 2025-12-01 09:13:43.067857641 +0000 UTC m=+0.227481962 container attach 0dc64f2d4b4092a1c2cf5ee4c84725c16bd553ea24006da5dd50358a5e69315f (image=quay.io/ceph/ceph:v18, name=recursing_euclid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:13:43 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:43 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:43 compute-0 sudo[83379]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:43 compute-0 sudo[83379]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:43 compute-0 sudo[83379]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:43 compute-0 sudo[83404]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:13:43 compute-0 sudo[83404]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:43 compute-0 sudo[83404]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:43 compute-0 sudo[83429]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:43 compute-0 sudo[83429]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:43 compute-0 sudo[83429]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:43 compute-0 ceph-mgr[83335]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 01 09:13:43 compute-0 ceph-mgr[83335]: mgr[py] Loading python module 'balancer'
Dec 01 09:13:43 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-htextg[83330]: 2025-12-01T09:13:43.411+0000 7fc783261140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 01 09:13:43 compute-0 sudo[83454]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:13:43 compute-0 sudo[83454]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:43 compute-0 sudo[83454]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:43 compute-0 sudo[83498]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:43 compute-0 sudo[83498]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:43 compute-0 sudo[83498]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:43 compute-0 sudo[83523]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ls
Dec 01 09:13:43 compute-0 sudo[83523]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:43 compute-0 ceph-mgr[83335]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 01 09:13:43 compute-0 ceph-mgr[83335]: mgr[py] Loading python module 'cephadm'
Dec 01 09:13:43 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-htextg[83330]: 2025-12-01T09:13:43.729+0000 7fc783261140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 01 09:13:43 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14186 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:13:43 compute-0 sudo[83556]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:43 compute-0 sudo[83556]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:43 compute-0 sudo[83556]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:43 compute-0 sudo[83598]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:13:43 compute-0 sudo[83598]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:43 compute-0 sudo[83598]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:44 compute-0 sudo[83639]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:44 compute-0 sudo[83639]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:44 compute-0 sudo[83639]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:44 compute-0 ceph-mon[75031]: pgmap v9: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:13:44 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:44 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:44 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:44 compute-0 ceph-mon[75031]: from='client.14186 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:13:44 compute-0 sudo[83676]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 check-host --expect-hostname compute-0
Dec 01 09:13:44 compute-0 sudo[83676]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:44 compute-0 podman[83712]: 2025-12-01 09:13:44.162046511 +0000 UTC m=+0.059143406 container exec a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, CEPH_REF=reef, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:13:44 compute-0 podman[83712]: 2025-12-01 09:13:44.285082563 +0000 UTC m=+0.182179448 container exec_died a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Dec 01 09:13:44 compute-0 sudo[83676]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0) v1
Dec 01 09:13:44 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0) v1
Dec 01 09:13:44 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0) v1
Dec 01 09:13:44 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0) v1
Dec 01 09:13:44 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:44 compute-0 ceph-mgr[75324]: [cephadm INFO root] Added host compute-0
Dec 01 09:13:44 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Added host compute-0
Dec 01 09:13:44 compute-0 ceph-mgr[75324]: [cephadm INFO root] Saving service mon spec with placement compute-0
Dec 01 09:13:44 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Saving service mon spec with placement compute-0
Dec 01 09:13:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) v1
Dec 01 09:13:44 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:44 compute-0 ceph-mgr[75324]: [cephadm INFO root] Saving service mgr spec with placement compute-0
Dec 01 09:13:44 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Saving service mgr spec with placement compute-0
Dec 01 09:13:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mgr}] v 0) v1
Dec 01 09:13:44 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:44 compute-0 ceph-mgr[75324]: [cephadm INFO root] Marking host: compute-0 for OSDSpec preview refresh.
Dec 01 09:13:44 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Marking host: compute-0 for OSDSpec preview refresh.
Dec 01 09:13:44 compute-0 ceph-mgr[75324]: [cephadm INFO root] Saving service osd.default_drive_group spec with placement compute-0
Dec 01 09:13:44 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Saving service osd.default_drive_group spec with placement compute-0
Dec 01 09:13:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.osd.default_drive_group}] v 0) v1
Dec 01 09:13:44 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:44 compute-0 recursing_euclid[83375]: Added host 'compute-0' with addr '192.168.122.100'
Dec 01 09:13:44 compute-0 recursing_euclid[83375]: Scheduled mon update...
Dec 01 09:13:44 compute-0 recursing_euclid[83375]: Scheduled mgr update...
Dec 01 09:13:44 compute-0 recursing_euclid[83375]: Scheduled osd.default_drive_group update...
Dec 01 09:13:44 compute-0 systemd[1]: libpod-0dc64f2d4b4092a1c2cf5ee4c84725c16bd553ea24006da5dd50358a5e69315f.scope: Deactivated successfully.
Dec 01 09:13:44 compute-0 podman[83334]: 2025-12-01 09:13:44.438617329 +0000 UTC m=+1.598241660 container died 0dc64f2d4b4092a1c2cf5ee4c84725c16bd553ea24006da5dd50358a5e69315f (image=quay.io/ceph/ceph:v18, name=recursing_euclid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Dec 01 09:13:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-781a1fb8e63b0412ffeeb441cad9f2807ea13a9cc0ba03a7d3f142e2768a3e86-merged.mount: Deactivated successfully.
Dec 01 09:13:44 compute-0 podman[83334]: 2025-12-01 09:13:44.515110569 +0000 UTC m=+1.674734880 container remove 0dc64f2d4b4092a1c2cf5ee4c84725c16bd553ea24006da5dd50358a5e69315f (image=quay.io/ceph/ceph:v18, name=recursing_euclid, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0)
Dec 01 09:13:44 compute-0 systemd[1]: libpod-conmon-0dc64f2d4b4092a1c2cf5ee4c84725c16bd553ea24006da5dd50358a5e69315f.scope: Deactivated successfully.
Dec 01 09:13:44 compute-0 sudo[83298]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:44 compute-0 sudo[83523]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:13:44 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:13:44 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:13:44 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:13:44 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:13:44 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:13:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec 01 09:13:44 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:13:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec 01 09:13:44 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:44 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 52a4df2c-eef2-44c2-ae44-71aaae30c143 does not exist
Dec 01 09:13:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) v1
Dec 01 09:13:44 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:44 compute-0 ceph-mgr[75324]: [progress INFO root] update: starting ev a0021272-0d49-43fd-b7af-09d21e2bb5e5 (Updating mgr deployment (-1 -> 1))
Dec 01 09:13:44 compute-0 ceph-mgr[75324]: [cephadm INFO cephadm.serve] Removing daemon mgr.compute-0.htextg from compute-0 -- ports [8765]
Dec 01 09:13:44 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Removing daemon mgr.compute-0.htextg from compute-0 -- ports [8765]
Dec 01 09:13:44 compute-0 sudo[83831]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:44 compute-0 sudo[83831]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:44 compute-0 sudo[83831]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:44 compute-0 sudo[83879]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lizrvwrtqpzqhwdwxgnqdlxxizejtxee ; /usr/bin/python3'
Dec 01 09:13:44 compute-0 sudo[83879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:13:44 compute-0 sudo[83880]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:13:44 compute-0 sudo[83880]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:44 compute-0 sudo[83880]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:44 compute-0 sudo[83907]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:44 compute-0 sudo[83907]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:44 compute-0 sudo[83907]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:44 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v10: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:13:44 compute-0 python3[83886]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z   --volume /home/ceph-admin/specs/ceph_spec.yaml:/home/ceph_spec.yaml:z   --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   status --format json | jq .osdmap.num_up_osds _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:13:44 compute-0 sudo[83932]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 rm-daemon --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --name mgr.compute-0.htextg --force --tcp-ports 8765
Dec 01 09:13:44 compute-0 sudo[83932]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:45 compute-0 podman[83958]: 2025-12-01 09:13:45.06476909 +0000 UTC m=+0.053854209 container create 5912e43bcb69d1324ee3167f03cd8e94b04ef56eb35f9f38109062f625844f72 (image=quay.io/ceph/ceph:v18, name=objective_davinci, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507)
Dec 01 09:13:45 compute-0 systemd[1]: Started libpod-conmon-5912e43bcb69d1324ee3167f03cd8e94b04ef56eb35f9f38109062f625844f72.scope.
Dec 01 09:13:45 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:13:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/042dc70dd7bc9edd0275eb290c5e7b3e95029da508c8d83d23a5f5afc7241b8e/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/042dc70dd7bc9edd0275eb290c5e7b3e95029da508c8d83d23a5f5afc7241b8e/merged/home/ceph_spec.yaml supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/042dc70dd7bc9edd0275eb290c5e7b3e95029da508c8d83d23a5f5afc7241b8e/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:45 compute-0 podman[83958]: 2025-12-01 09:13:45.048463016 +0000 UTC m=+0.037548135 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:13:45 compute-0 podman[83958]: 2025-12-01 09:13:45.13859935 +0000 UTC m=+0.127684489 container init 5912e43bcb69d1324ee3167f03cd8e94b04ef56eb35f9f38109062f625844f72 (image=quay.io/ceph/ceph:v18, name=objective_davinci, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Dec 01 09:13:45 compute-0 podman[83958]: 2025-12-01 09:13:45.147754392 +0000 UTC m=+0.136839511 container start 5912e43bcb69d1324ee3167f03cd8e94b04ef56eb35f9f38109062f625844f72 (image=quay.io/ceph/ceph:v18, name=objective_davinci, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:13:45 compute-0 podman[83958]: 2025-12-01 09:13:45.153040829 +0000 UTC m=+0.142125938 container attach 5912e43bcb69d1324ee3167f03cd8e94b04ef56eb35f9f38109062f625844f72 (image=quay.io/ceph/ceph:v18, name=objective_davinci, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True)
Dec 01 09:13:45 compute-0 systemd[1]: Stopping Ceph mgr.compute-0.htextg for 5620a9fb-e540-5250-a0e8-7aaad5347e3b...
Dec 01 09:13:45 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:45 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:45 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:45 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:45 compute-0 ceph-mon[75031]: Added host compute-0
Dec 01 09:13:45 compute-0 ceph-mon[75031]: Saving service mon spec with placement compute-0
Dec 01 09:13:45 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:45 compute-0 ceph-mon[75031]: Saving service mgr spec with placement compute-0
Dec 01 09:13:45 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:45 compute-0 ceph-mon[75031]: Marking host: compute-0 for OSDSpec preview refresh.
Dec 01 09:13:45 compute-0 ceph-mon[75031]: Saving service osd.default_drive_group spec with placement compute-0
Dec 01 09:13:45 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:45 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:45 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:45 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:45 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:45 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:13:45 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:13:45 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:45 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:45 compute-0 ceph-mon[75031]: Removing daemon mgr.compute-0.htextg from compute-0 -- ports [8765]
Dec 01 09:13:45 compute-0 podman[84053]: 2025-12-01 09:13:45.557846422 +0000 UTC m=+0.106772270 container died 81acc04116fbf4755e71fd111eeaa509f9560ae5f0189ee1e5afafef5403a9ce (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-htextg, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, ceph=True, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS)
Dec 01 09:13:45 compute-0 systemd[1]: var-lib-containers-storage-overlay-2c2a8cd3227a37de55446263fb0fb96c95365808b997cd221ffa58d6e824aa46-merged.mount: Deactivated successfully.
Dec 01 09:13:45 compute-0 podman[84053]: 2025-12-01 09:13:45.625660934 +0000 UTC m=+0.174586782 container remove 81acc04116fbf4755e71fd111eeaa509f9560ae5f0189ee1e5afafef5403a9ce (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-htextg, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507)
Dec 01 09:13:45 compute-0 bash[84053]: ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-htextg
Dec 01 09:13:45 compute-0 systemd[1]: ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b@mgr.compute-0.htextg.service: Main process exited, code=exited, status=143/n/a
Dec 01 09:13:45 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json"} v 0) v1
Dec 01 09:13:45 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2938897347' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Dec 01 09:13:45 compute-0 objective_davinci[83973]: 
Dec 01 09:13:45 compute-0 objective_davinci[83973]: {"fsid":"5620a9fb-e540-5250-a0e8-7aaad5347e3b","health":{"status":"HEALTH_WARN","checks":{"TOO_FEW_OSDS":{"severity":"HEALTH_WARN","summary":{"message":"OSD count 0 < osd_pool_default_size 1","count":1},"muted":false}},"mutes":[]},"election_epoch":5,"quorum":[0],"quorum_names":["compute-0"],"quorum_age":82,"monmap":{"epoch":1,"min_mon_release_name":"reef","num_mons":1},"osdmap":{"epoch":3,"num_osds":0,"num_up_osds":0,"osd_up_since":0,"num_in_osds":0,"osd_in_since":0,"num_remapped_pgs":0},"pgmap":{"pgs_by_state":[],"num_pgs":0,"num_pools":0,"num_objects":0,"data_bytes":0,"bytes_used":0,"bytes_avail":0,"bytes_total":0},"fsmap":{"epoch":1,"by_rank":[],"up:standby":0},"mgrmap":{"available":true,"num_standbys":0,"modules":["cephadm","iostat","nfs","restful"],"services":{}},"servicemap":{"epoch":1,"modified":"2025-12-01T09:12:20.101670+0000","services":{}},"progress_events":{"0ad1d74c-45e7-464b-841d-9ea23a988291":{"message":"Updating mgr deployment (+1 -> 2) (0s)\n      [............................] ","progress":0,"add_to_ceph_s":true}}}
Dec 01 09:13:45 compute-0 systemd[1]: libpod-5912e43bcb69d1324ee3167f03cd8e94b04ef56eb35f9f38109062f625844f72.scope: Deactivated successfully.
Dec 01 09:13:45 compute-0 conmon[83973]: conmon 5912e43bcb69d1324ee3 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-5912e43bcb69d1324ee3167f03cd8e94b04ef56eb35f9f38109062f625844f72.scope/container/memory.events
Dec 01 09:13:45 compute-0 podman[83958]: 2025-12-01 09:13:45.835070908 +0000 UTC m=+0.824156027 container died 5912e43bcb69d1324ee3167f03cd8e94b04ef56eb35f9f38109062f625844f72 (image=quay.io/ceph/ceph:v18, name=objective_davinci, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 01 09:13:45 compute-0 systemd[1]: var-lib-containers-storage-overlay-042dc70dd7bc9edd0275eb290c5e7b3e95029da508c8d83d23a5f5afc7241b8e-merged.mount: Deactivated successfully.
Dec 01 09:13:46 compute-0 podman[83958]: 2025-12-01 09:13:46.040735182 +0000 UTC m=+1.029820301 container remove 5912e43bcb69d1324ee3167f03cd8e94b04ef56eb35f9f38109062f625844f72 (image=quay.io/ceph/ceph:v18, name=objective_davinci, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:13:46 compute-0 systemd[1]: libpod-conmon-5912e43bcb69d1324ee3167f03cd8e94b04ef56eb35f9f38109062f625844f72.scope: Deactivated successfully.
Dec 01 09:13:46 compute-0 sudo[83879]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:46 compute-0 systemd[1]: ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b@mgr.compute-0.htextg.service: Failed with result 'exit-code'.
Dec 01 09:13:46 compute-0 systemd[1]: Stopped Ceph mgr.compute-0.htextg for 5620a9fb-e540-5250-a0e8-7aaad5347e3b.
Dec 01 09:13:46 compute-0 systemd[1]: ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b@mgr.compute-0.htextg.service: Consumed 3.727s CPU time.
Dec 01 09:13:46 compute-0 systemd[1]: Reloading.
Dec 01 09:13:46 compute-0 systemd-rc-local-generator[84165]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:13:46 compute-0 systemd-sysv-generator[84170]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:13:46 compute-0 ceph-mon[75031]: pgmap v10: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:13:46 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2938897347' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Dec 01 09:13:46 compute-0 sudo[83932]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:46 compute-0 ceph-mgr[75324]: [cephadm INFO cephadm.services.cephadmservice] Removing key for mgr.compute-0.htextg
Dec 01 09:13:46 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Removing key for mgr.compute-0.htextg
Dec 01 09:13:46 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth rm", "entity": "mgr.compute-0.htextg"} v 0) v1
Dec 01 09:13:46 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth rm", "entity": "mgr.compute-0.htextg"}]: dispatch
Dec 01 09:13:46 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "auth rm", "entity": "mgr.compute-0.htextg"}]': finished
Dec 01 09:13:46 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mgr}] v 0) v1
Dec 01 09:13:46 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:46 compute-0 ceph-mgr[75324]: [progress INFO root] complete: finished ev a0021272-0d49-43fd-b7af-09d21e2bb5e5 (Updating mgr deployment (-1 -> 1))
Dec 01 09:13:46 compute-0 ceph-mgr[75324]: [progress INFO root] Completed event a0021272-0d49-43fd-b7af-09d21e2bb5e5 (Updating mgr deployment (-1 -> 1)) in 2 seconds
Dec 01 09:13:46 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mgr}] v 0) v1
Dec 01 09:13:46 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:46 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 83de6e1c-582f-40e3-9017-57f8f353e9e3 does not exist
Dec 01 09:13:46 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec 01 09:13:46 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:13:46 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec 01 09:13:46 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:13:46 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:13:46 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:13:46 compute-0 sudo[84177]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:46 compute-0 sudo[84177]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:46 compute-0 sudo[84177]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:46 compute-0 sudo[84202]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:13:46 compute-0 sudo[84202]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:46 compute-0 sudo[84202]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:46 compute-0 sudo[84227]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:13:46 compute-0 sudo[84227]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:46 compute-0 sudo[84227]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:46 compute-0 sudo[84252]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --yes --no-systemd
Dec 01 09:13:46 compute-0 sudo[84252]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:13:46 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v11: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:13:47 compute-0 podman[84315]: 2025-12-01 09:13:47.047013324 +0000 UTC m=+0.051451748 container create 7a9ca4b8a8914da99565ecea5c24dcd36d79f5e64ead9406b49ff7facea404cf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=magical_pascal, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:13:47 compute-0 systemd[1]: Started libpod-conmon-7a9ca4b8a8914da99565ecea5c24dcd36d79f5e64ead9406b49ff7facea404cf.scope.
Dec 01 09:13:47 compute-0 podman[84315]: 2025-12-01 09:13:47.025173936 +0000 UTC m=+0.029612400 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:13:47 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:13:47 compute-0 podman[84315]: 2025-12-01 09:13:47.15001186 +0000 UTC m=+0.154450294 container init 7a9ca4b8a8914da99565ecea5c24dcd36d79f5e64ead9406b49ff7facea404cf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=magical_pascal, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250507, ceph=True, org.label-schema.schema-version=1.0)
Dec 01 09:13:47 compute-0 podman[84315]: 2025-12-01 09:13:47.157336528 +0000 UTC m=+0.161774942 container start 7a9ca4b8a8914da99565ecea5c24dcd36d79f5e64ead9406b49ff7facea404cf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=magical_pascal, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:13:47 compute-0 magical_pascal[84332]: 167 167
Dec 01 09:13:47 compute-0 systemd[1]: libpod-7a9ca4b8a8914da99565ecea5c24dcd36d79f5e64ead9406b49ff7facea404cf.scope: Deactivated successfully.
Dec 01 09:13:47 compute-0 podman[84315]: 2025-12-01 09:13:47.177060143 +0000 UTC m=+0.181498547 container attach 7a9ca4b8a8914da99565ecea5c24dcd36d79f5e64ead9406b49ff7facea404cf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=magical_pascal, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Dec 01 09:13:47 compute-0 podman[84315]: 2025-12-01 09:13:47.17796415 +0000 UTC m=+0.182402574 container died 7a9ca4b8a8914da99565ecea5c24dcd36d79f5e64ead9406b49ff7facea404cf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=magical_pascal, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.build-date=20250507)
Dec 01 09:13:47 compute-0 systemd[1]: var-lib-containers-storage-overlay-3209a735490b9315d8878b71ea8f8959e60d36f0c3e955bfe2843293b3befc36-merged.mount: Deactivated successfully.
Dec 01 09:13:47 compute-0 podman[84315]: 2025-12-01 09:13:47.218585935 +0000 UTC m=+0.223024349 container remove 7a9ca4b8a8914da99565ecea5c24dcd36d79f5e64ead9406b49ff7facea404cf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=magical_pascal, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:13:47 compute-0 systemd[1]: libpod-conmon-7a9ca4b8a8914da99565ecea5c24dcd36d79f5e64ead9406b49ff7facea404cf.scope: Deactivated successfully.
Dec 01 09:13:47 compute-0 podman[84355]: 2025-12-01 09:13:47.382690305 +0000 UTC m=+0.026745445 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:13:47 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e3 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:13:47 compute-0 podman[84355]: 2025-12-01 09:13:47.988212444 +0000 UTC m=+0.632267564 container create a71b4729cdb67de43cbbf02d39a700a839e1563588bb61b98e6570017b7dcd44 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_goldwasser, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, ceph=True, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 01 09:13:47 compute-0 ceph-mon[75031]: Removing key for mgr.compute-0.htextg
Dec 01 09:13:47 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth rm", "entity": "mgr.compute-0.htextg"}]: dispatch
Dec 01 09:13:47 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "auth rm", "entity": "mgr.compute-0.htextg"}]': finished
Dec 01 09:13:47 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:47 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:47 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:13:47 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:13:47 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:13:48 compute-0 ceph-mgr[75324]: [progress INFO root] Writing back 3 completed events
Dec 01 09:13:48 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) v1
Dec 01 09:13:48 compute-0 systemd[1]: Started libpod-conmon-a71b4729cdb67de43cbbf02d39a700a839e1563588bb61b98e6570017b7dcd44.scope.
Dec 01 09:13:48 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:48 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:13:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74c7265ba961e34952af6201846e515b128d80cd0e7d4202dbbaa89b7350da2c/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74c7265ba961e34952af6201846e515b128d80cd0e7d4202dbbaa89b7350da2c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74c7265ba961e34952af6201846e515b128d80cd0e7d4202dbbaa89b7350da2c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74c7265ba961e34952af6201846e515b128d80cd0e7d4202dbbaa89b7350da2c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74c7265ba961e34952af6201846e515b128d80cd0e7d4202dbbaa89b7350da2c/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:13:48 compute-0 podman[84355]: 2025-12-01 09:13:48.157498318 +0000 UTC m=+0.801553448 container init a71b4729cdb67de43cbbf02d39a700a839e1563588bb61b98e6570017b7dcd44 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_goldwasser, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 01 09:13:48 compute-0 podman[84355]: 2025-12-01 09:13:48.166839665 +0000 UTC m=+0.810894785 container start a71b4729cdb67de43cbbf02d39a700a839e1563588bb61b98e6570017b7dcd44 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_goldwasser, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_REF=reef)
Dec 01 09:13:48 compute-0 podman[84355]: 2025-12-01 09:13:48.1774227 +0000 UTC m=+0.821477830 container attach a71b4729cdb67de43cbbf02d39a700a839e1563588bb61b98e6570017b7dcd44 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_goldwasser, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:13:48 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v12: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:13:49 compute-0 ceph-mon[75031]: pgmap v11: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:13:49 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:13:49 compute-0 affectionate_goldwasser[84371]: --> passed data devices: 0 physical, 3 LVM
Dec 01 09:13:49 compute-0 affectionate_goldwasser[84371]: --> relative data size: 1.0
Dec 01 09:13:49 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 01 09:13:49 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 9cfc4d29-4b80-4e2d-94cb-e544135847a5
Dec 01 09:13:49 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd new", "uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5"} v 0) v1
Dec 01 09:13:49 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3687895600' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5"}]: dispatch
Dec 01 09:13:49 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e3 do_prune osdmap full prune enabled
Dec 01 09:13:49 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e3 encode_pending skipping prime_pg_temp; mapping job did not start
Dec 01 09:13:49 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3687895600' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5"}]': finished
Dec 01 09:13:49 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e4 e4: 1 total, 0 up, 1 in
Dec 01 09:13:49 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e4: 1 total, 0 up, 1 in
Dec 01 09:13:49 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) v1
Dec 01 09:13:49 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 01 09:13:49 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec 01 09:13:49 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 01 09:13:49 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-0
Dec 01 09:13:49 compute-0 lvm[84432]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 01 09:13:49 compute-0 lvm[84432]: VG ceph_vg0 finished
Dec 01 09:13:49 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Dec 01 09:13:49 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec 01 09:13:49 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Dec 01 09:13:49 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-0/activate.monmap
Dec 01 09:13:50 compute-0 ceph-mon[75031]: pgmap v12: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:13:50 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3687895600' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5"}]: dispatch
Dec 01 09:13:50 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3687895600' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5"}]': finished
Dec 01 09:13:50 compute-0 ceph-mon[75031]: osdmap e4: 1 total, 0 up, 1 in
Dec 01 09:13:50 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 01 09:13:50 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon getmap"} v 0) v1
Dec 01 09:13:50 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2261522954' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Dec 01 09:13:50 compute-0 affectionate_goldwasser[84371]:  stderr: got monmap epoch 1
Dec 01 09:13:50 compute-0 affectionate_goldwasser[84371]: --> Creating keyring file for osd.0
Dec 01 09:13:50 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/keyring
Dec 01 09:13:50 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/
Dec 01 09:13:50 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 0 --monmap /var/lib/ceph/osd/ceph-0/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-0/ --osd-uuid 9cfc4d29-4b80-4e2d-94cb-e544135847a5 --setuser ceph --setgroup ceph
Dec 01 09:13:50 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v14: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:13:51 compute-0 ceph-mon[75031]: log_channel(cluster) log [INF] : Health check cleared: TOO_FEW_OSDS (was: OSD count 0 < osd_pool_default_size 1)
Dec 01 09:13:51 compute-0 ceph-mon[75031]: log_channel(cluster) log [INF] : Cluster is now healthy
Dec 01 09:13:51 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2261522954' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Dec 01 09:13:52 compute-0 ceph-mon[75031]: pgmap v14: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:13:52 compute-0 ceph-mon[75031]: Health check cleared: TOO_FEW_OSDS (was: OSD count 0 < osd_pool_default_size 1)
Dec 01 09:13:52 compute-0 ceph-mon[75031]: Cluster is now healthy
Dec 01 09:13:52 compute-0 affectionate_goldwasser[84371]:  stderr: 2025-12-01T09:13:50.419+0000 7f73f023d740 -1 bluestore(/var/lib/ceph/osd/ceph-0//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Dec 01 09:13:52 compute-0 affectionate_goldwasser[84371]:  stderr: 2025-12-01T09:13:50.419+0000 7f73f023d740 -1 bluestore(/var/lib/ceph/osd/ceph-0//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Dec 01 09:13:52 compute-0 affectionate_goldwasser[84371]:  stderr: 2025-12-01T09:13:50.419+0000 7f73f023d740 -1 bluestore(/var/lib/ceph/osd/ceph-0//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Dec 01 09:13:52 compute-0 affectionate_goldwasser[84371]:  stderr: 2025-12-01T09:13:50.419+0000 7f73f023d740 -1 bluestore(/var/lib/ceph/osd/ceph-0/) _read_fsid unparsable uuid
Dec 01 09:13:52 compute-0 affectionate_goldwasser[84371]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Dec 01 09:13:52 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec 01 09:13:52 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
Dec 01 09:13:52 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e4 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:13:52 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Dec 01 09:13:52 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
Dec 01 09:13:52 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec 01 09:13:52 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec 01 09:13:52 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v15: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:13:52 compute-0 affectionate_goldwasser[84371]: --> ceph-volume lvm activate successful for osd ID: 0
Dec 01 09:13:52 compute-0 affectionate_goldwasser[84371]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Dec 01 09:13:53 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 01 09:13:53 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new b055e1b3-f94e-4d5e-be04-bafc3cd07aa2
Dec 01 09:13:53 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd new", "uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2"} v 0) v1
Dec 01 09:13:53 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3614368394' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2"}]: dispatch
Dec 01 09:13:53 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e4 do_prune osdmap full prune enabled
Dec 01 09:13:53 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e4 encode_pending skipping prime_pg_temp; mapping job did not start
Dec 01 09:13:53 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3614368394' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2"}]': finished
Dec 01 09:13:53 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e5 e5: 2 total, 0 up, 2 in
Dec 01 09:13:53 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e5: 2 total, 0 up, 2 in
Dec 01 09:13:53 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) v1
Dec 01 09:13:53 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 01 09:13:53 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec 01 09:13:53 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:13:53 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec 01 09:13:53 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec 01 09:13:53 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3614368394' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2"}]: dispatch
Dec 01 09:13:53 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3614368394' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2"}]': finished
Dec 01 09:13:53 compute-0 ceph-mon[75031]: osdmap e5: 2 total, 0 up, 2 in
Dec 01 09:13:53 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 01 09:13:53 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:13:53 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 01 09:13:53 compute-0 lvm[85371]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 01 09:13:53 compute-0 lvm[85371]: VG ceph_vg1 finished
Dec 01 09:13:53 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-1
Dec 01 09:13:53 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg1/ceph_lv1
Dec 01 09:13:53 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Dec 01 09:13:53 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/ln -s /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-1/block
Dec 01 09:13:53 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-1/activate.monmap
Dec 01 09:13:53 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon getmap"} v 0) v1
Dec 01 09:13:53 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1284577700' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Dec 01 09:13:53 compute-0 affectionate_goldwasser[84371]:  stderr: got monmap epoch 1
Dec 01 09:13:54 compute-0 affectionate_goldwasser[84371]: --> Creating keyring file for osd.1
Dec 01 09:13:54 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1/keyring
Dec 01 09:13:54 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1/
Dec 01 09:13:54 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 1 --monmap /var/lib/ceph/osd/ceph-1/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-1/ --osd-uuid b055e1b3-f94e-4d5e-be04-bafc3cd07aa2 --setuser ceph --setgroup ceph
Dec 01 09:13:54 compute-0 ceph-mon[75031]: pgmap v15: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:13:54 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1284577700' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Dec 01 09:13:54 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v17: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:13:56 compute-0 ceph-mon[75031]: pgmap v17: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:13:56 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v18: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:13:57 compute-0 affectionate_goldwasser[84371]:  stderr: 2025-12-01T09:13:54.102+0000 7f3e917a0740 -1 bluestore(/var/lib/ceph/osd/ceph-1//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Dec 01 09:13:57 compute-0 affectionate_goldwasser[84371]:  stderr: 2025-12-01T09:13:54.102+0000 7f3e917a0740 -1 bluestore(/var/lib/ceph/osd/ceph-1//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Dec 01 09:13:57 compute-0 affectionate_goldwasser[84371]:  stderr: 2025-12-01T09:13:54.102+0000 7f3e917a0740 -1 bluestore(/var/lib/ceph/osd/ceph-1//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Dec 01 09:13:57 compute-0 affectionate_goldwasser[84371]:  stderr: 2025-12-01T09:13:54.102+0000 7f3e917a0740 -1 bluestore(/var/lib/ceph/osd/ceph-1/) _read_fsid unparsable uuid
Dec 01 09:13:57 compute-0 affectionate_goldwasser[84371]: --> ceph-volume lvm prepare successful for: ceph_vg1/ceph_lv1
Dec 01 09:13:57 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Dec 01 09:13:57 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg1/ceph_lv1 --path /var/lib/ceph/osd/ceph-1 --no-mon-config
Dec 01 09:13:57 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/ln -snf /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-1/block
Dec 01 09:13:57 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block
Dec 01 09:13:57 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Dec 01 09:13:57 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Dec 01 09:13:57 compute-0 affectionate_goldwasser[84371]: --> ceph-volume lvm activate successful for osd ID: 1
Dec 01 09:13:57 compute-0 affectionate_goldwasser[84371]: --> ceph-volume lvm create successful for: ceph_vg1/ceph_lv1
Dec 01 09:13:57 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 01 09:13:57 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new c0c71a6c-e9f0-420a-90ae-6660eaf041be
Dec 01 09:13:57 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd new", "uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be"} v 0) v1
Dec 01 09:13:57 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/407475950' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be"}]: dispatch
Dec 01 09:13:57 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e5 do_prune osdmap full prune enabled
Dec 01 09:13:57 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e5 encode_pending skipping prime_pg_temp; mapping job did not start
Dec 01 09:13:57 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/407475950' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be"}]': finished
Dec 01 09:13:57 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e6 e6: 3 total, 0 up, 3 in
Dec 01 09:13:57 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e6: 3 total, 0 up, 3 in
Dec 01 09:13:57 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) v1
Dec 01 09:13:57 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 01 09:13:57 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec 01 09:13:57 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:13:57 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec 01 09:13:57 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:13:57 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec 01 09:13:57 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec 01 09:13:57 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec 01 09:13:57 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e6 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:13:57 compute-0 lvm[86308]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 01 09:13:57 compute-0 lvm[86308]: VG ceph_vg2 finished
Dec 01 09:13:57 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 01 09:13:58 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-2
Dec 01 09:13:58 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg2/ceph_lv2
Dec 01 09:13:58 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2
Dec 01 09:13:58 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/ln -s /dev/ceph_vg2/ceph_lv2 /var/lib/ceph/osd/ceph-2/block
Dec 01 09:13:58 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-2/activate.monmap
Dec 01 09:13:58 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon getmap"} v 0) v1
Dec 01 09:13:58 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/54861264' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Dec 01 09:13:58 compute-0 affectionate_goldwasser[84371]:  stderr: got monmap epoch 1
Dec 01 09:13:58 compute-0 affectionate_goldwasser[84371]: --> Creating keyring file for osd.2
Dec 01 09:13:58 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/keyring
Dec 01 09:13:58 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/
Dec 01 09:13:58 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 2 --monmap /var/lib/ceph/osd/ceph-2/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-2/ --osd-uuid c0c71a6c-e9f0-420a-90ae-6660eaf041be --setuser ceph --setgroup ceph
Dec 01 09:13:58 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v20: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:13:59 compute-0 ceph-mon[75031]: pgmap v18: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:13:59 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/407475950' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be"}]: dispatch
Dec 01 09:13:59 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/407475950' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be"}]': finished
Dec 01 09:13:59 compute-0 ceph-mon[75031]: osdmap e6: 3 total, 0 up, 3 in
Dec 01 09:13:59 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 01 09:13:59 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:13:59 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:13:59 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/54861264' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Dec 01 09:14:00 compute-0 ceph-mon[75031]: pgmap v20: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:14:00 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v21: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:14:02 compute-0 affectionate_goldwasser[84371]:  stderr: 2025-12-01T09:13:58.536+0000 7f8f66a21740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Dec 01 09:14:02 compute-0 affectionate_goldwasser[84371]:  stderr: 2025-12-01T09:13:58.536+0000 7f8f66a21740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Dec 01 09:14:02 compute-0 affectionate_goldwasser[84371]:  stderr: 2025-12-01T09:13:58.536+0000 7f8f66a21740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Dec 01 09:14:02 compute-0 affectionate_goldwasser[84371]:  stderr: 2025-12-01T09:13:58.536+0000 7f8f66a21740 -1 bluestore(/var/lib/ceph/osd/ceph-2/) _read_fsid unparsable uuid
Dec 01 09:14:02 compute-0 affectionate_goldwasser[84371]: --> ceph-volume lvm prepare successful for: ceph_vg2/ceph_lv2
Dec 01 09:14:02 compute-0 ceph-mon[75031]: pgmap v21: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:14:02 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec 01 09:14:02 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg2/ceph_lv2 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Dec 01 09:14:02 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/ln -snf /dev/ceph_vg2/ceph_lv2 /var/lib/ceph/osd/ceph-2/block
Dec 01 09:14:02 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Dec 01 09:14:02 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2
Dec 01 09:14:02 compute-0 affectionate_goldwasser[84371]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec 01 09:14:02 compute-0 affectionate_goldwasser[84371]: --> ceph-volume lvm activate successful for osd ID: 2
Dec 01 09:14:02 compute-0 affectionate_goldwasser[84371]: --> ceph-volume lvm create successful for: ceph_vg2/ceph_lv2
Dec 01 09:14:02 compute-0 systemd[1]: libpod-a71b4729cdb67de43cbbf02d39a700a839e1563588bb61b98e6570017b7dcd44.scope: Deactivated successfully.
Dec 01 09:14:02 compute-0 systemd[1]: libpod-a71b4729cdb67de43cbbf02d39a700a839e1563588bb61b98e6570017b7dcd44.scope: Consumed 6.871s CPU time.
Dec 01 09:14:02 compute-0 podman[87222]: 2025-12-01 09:14:02.656689128 +0000 UTC m=+0.029793369 container died a71b4729cdb67de43cbbf02d39a700a839e1563588bb61b98e6570017b7dcd44 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_goldwasser, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:14:02 compute-0 systemd[1]: var-lib-containers-storage-overlay-74c7265ba961e34952af6201846e515b128d80cd0e7d4202dbbaa89b7350da2c-merged.mount: Deactivated successfully.
Dec 01 09:14:02 compute-0 podman[87222]: 2025-12-01 09:14:02.7466415 +0000 UTC m=+0.119745721 container remove a71b4729cdb67de43cbbf02d39a700a839e1563588bb61b98e6570017b7dcd44 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_goldwasser, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:14:02 compute-0 systemd[1]: libpod-conmon-a71b4729cdb67de43cbbf02d39a700a839e1563588bb61b98e6570017b7dcd44.scope: Deactivated successfully.
Dec 01 09:14:02 compute-0 sudo[84252]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:02 compute-0 sudo[87237]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:02 compute-0 sudo[87237]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:02 compute-0 sudo[87237]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:02 compute-0 sudo[87262]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:14:02 compute-0 sudo[87262]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:02 compute-0 sudo[87262]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:02 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e6 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:14:02 compute-0 sudo[87287]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:02 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v22: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:14:02 compute-0 sudo[87287]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:02 compute-0 sudo[87287]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:03 compute-0 sudo[87312]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- lvm list --format json
Dec 01 09:14:03 compute-0 sudo[87312]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:03 compute-0 podman[87376]: 2025-12-01 09:14:03.332683353 +0000 UTC m=+0.042444155 container create 21b4e40f3018825ba802520d8ae28186e72543cfcf796a03e8b27dc03e98be46 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pensive_blackburn, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Dec 01 09:14:03 compute-0 systemd[1]: Started libpod-conmon-21b4e40f3018825ba802520d8ae28186e72543cfcf796a03e8b27dc03e98be46.scope.
Dec 01 09:14:03 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:03 compute-0 podman[87376]: 2025-12-01 09:14:03.400848321 +0000 UTC m=+0.110609143 container init 21b4e40f3018825ba802520d8ae28186e72543cfcf796a03e8b27dc03e98be46 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pensive_blackburn, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:14:03 compute-0 podman[87376]: 2025-12-01 09:14:03.310503997 +0000 UTC m=+0.020264819 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:14:03 compute-0 podman[87376]: 2025-12-01 09:14:03.409169565 +0000 UTC m=+0.118930377 container start 21b4e40f3018825ba802520d8ae28186e72543cfcf796a03e8b27dc03e98be46 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pensive_blackburn, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3)
Dec 01 09:14:03 compute-0 podman[87376]: 2025-12-01 09:14:03.414008262 +0000 UTC m=+0.123769094 container attach 21b4e40f3018825ba802520d8ae28186e72543cfcf796a03e8b27dc03e98be46 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pensive_blackburn, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:14:03 compute-0 pensive_blackburn[87393]: 167 167
Dec 01 09:14:03 compute-0 systemd[1]: libpod-21b4e40f3018825ba802520d8ae28186e72543cfcf796a03e8b27dc03e98be46.scope: Deactivated successfully.
Dec 01 09:14:03 compute-0 podman[87376]: 2025-12-01 09:14:03.415636082 +0000 UTC m=+0.125396884 container died 21b4e40f3018825ba802520d8ae28186e72543cfcf796a03e8b27dc03e98be46 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pensive_blackburn, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:14:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-1e71f6d1e2062245aae319a32ca70d58fc0e1c77ba9019ffa5c7c1beba0d42e4-merged.mount: Deactivated successfully.
Dec 01 09:14:03 compute-0 podman[87376]: 2025-12-01 09:14:03.451590578 +0000 UTC m=+0.161351380 container remove 21b4e40f3018825ba802520d8ae28186e72543cfcf796a03e8b27dc03e98be46 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pensive_blackburn, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:14:03 compute-0 systemd[1]: libpod-conmon-21b4e40f3018825ba802520d8ae28186e72543cfcf796a03e8b27dc03e98be46.scope: Deactivated successfully.
Dec 01 09:14:03 compute-0 podman[87418]: 2025-12-01 09:14:03.674250375 +0000 UTC m=+0.105748135 container create 6c69176851dce42f6e5fa5ce4491073b900053f8970554903819aace9c77749a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_kepler, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:14:03 compute-0 podman[87418]: 2025-12-01 09:14:03.590037798 +0000 UTC m=+0.021535578 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:14:03 compute-0 systemd[1]: Started libpod-conmon-6c69176851dce42f6e5fa5ce4491073b900053f8970554903819aace9c77749a.scope.
Dec 01 09:14:03 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f91b24bdf23bba306838d7ed18b395c389a542559bb195576461ee2fe9080490/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f91b24bdf23bba306838d7ed18b395c389a542559bb195576461ee2fe9080490/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f91b24bdf23bba306838d7ed18b395c389a542559bb195576461ee2fe9080490/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f91b24bdf23bba306838d7ed18b395c389a542559bb195576461ee2fe9080490/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:03 compute-0 podman[87418]: 2025-12-01 09:14:03.751872121 +0000 UTC m=+0.183369901 container init 6c69176851dce42f6e5fa5ce4491073b900053f8970554903819aace9c77749a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_kepler, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef)
Dec 01 09:14:03 compute-0 podman[87418]: 2025-12-01 09:14:03.762414302 +0000 UTC m=+0.193912062 container start 6c69176851dce42f6e5fa5ce4491073b900053f8970554903819aace9c77749a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_kepler, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 01 09:14:03 compute-0 podman[87418]: 2025-12-01 09:14:03.766080644 +0000 UTC m=+0.197578404 container attach 6c69176851dce42f6e5fa5ce4491073b900053f8970554903819aace9c77749a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_kepler, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef)
Dec 01 09:14:04 compute-0 ceph-mon[75031]: pgmap v22: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]: {
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:     "0": [
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:         {
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             "devices": [
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "/dev/loop3"
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             ],
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             "lv_name": "ceph_lv0",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             "lv_size": "21470642176",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             "name": "ceph_lv0",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             "tags": {
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.cluster_name": "ceph",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.crush_device_class": "",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.encrypted": "0",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.osd_id": "0",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.type": "block",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.vdo": "0"
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             },
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             "type": "block",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             "vg_name": "ceph_vg0"
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:         }
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:     ],
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:     "1": [
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:         {
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             "devices": [
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "/dev/loop4"
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             ],
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             "lv_name": "ceph_lv1",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             "lv_size": "21470642176",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             "name": "ceph_lv1",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             "tags": {
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.cluster_name": "ceph",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.crush_device_class": "",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.encrypted": "0",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.osd_id": "1",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.type": "block",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.vdo": "0"
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             },
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             "type": "block",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             "vg_name": "ceph_vg1"
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:         }
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:     ],
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:     "2": [
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:         {
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             "devices": [
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "/dev/loop5"
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             ],
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             "lv_name": "ceph_lv2",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             "lv_size": "21470642176",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             "name": "ceph_lv2",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             "tags": {
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.cluster_name": "ceph",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.crush_device_class": "",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.encrypted": "0",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.osd_id": "2",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.type": "block",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:                 "ceph.vdo": "0"
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             },
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             "type": "block",
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:             "vg_name": "ceph_vg2"
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:         }
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]:     ]
Dec 01 09:14:04 compute-0 optimistic_kepler[87434]: }
Dec 01 09:14:04 compute-0 systemd[1]: libpod-6c69176851dce42f6e5fa5ce4491073b900053f8970554903819aace9c77749a.scope: Deactivated successfully.
Dec 01 09:14:04 compute-0 podman[87418]: 2025-12-01 09:14:04.579632483 +0000 UTC m=+1.011130253 container died 6c69176851dce42f6e5fa5ce4491073b900053f8970554903819aace9c77749a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_kepler, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:14:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-f91b24bdf23bba306838d7ed18b395c389a542559bb195576461ee2fe9080490-merged.mount: Deactivated successfully.
Dec 01 09:14:04 compute-0 podman[87418]: 2025-12-01 09:14:04.652935667 +0000 UTC m=+1.084433427 container remove 6c69176851dce42f6e5fa5ce4491073b900053f8970554903819aace9c77749a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_kepler, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:14:04 compute-0 systemd[1]: libpod-conmon-6c69176851dce42f6e5fa5ce4491073b900053f8970554903819aace9c77749a.scope: Deactivated successfully.
Dec 01 09:14:04 compute-0 sudo[87312]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:04 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "osd.0"} v 0) v1
Dec 01 09:14:04 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Dec 01 09:14:04 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:14:04 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:14:04 compute-0 ceph-mgr[75324]: [cephadm INFO cephadm.serve] Deploying daemon osd.0 on compute-0
Dec 01 09:14:04 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Deploying daemon osd.0 on compute-0
Dec 01 09:14:04 compute-0 sudo[87456]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:04 compute-0 sudo[87456]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:04 compute-0 sudo[87456]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:04 compute-0 sudo[87481]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:14:04 compute-0 sudo[87481]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:04 compute-0 sudo[87481]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:04 compute-0 sudo[87506]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:04 compute-0 sudo[87506]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:04 compute-0 sudo[87506]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:04 compute-0 sudo[87531]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 _orch deploy --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b
Dec 01 09:14:04 compute-0 sudo[87531]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:04 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v23: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:14:05 compute-0 podman[87596]: 2025-12-01 09:14:05.278386432 +0000 UTC m=+0.035631287 container create 156239ae54068a2b081d91ebb6820de04044872a52a8c32302c4451159369d8a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigilant_jemison, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=reef)
Dec 01 09:14:05 compute-0 systemd[1]: Started libpod-conmon-156239ae54068a2b081d91ebb6820de04044872a52a8c32302c4451159369d8a.scope.
Dec 01 09:14:05 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:05 compute-0 podman[87596]: 2025-12-01 09:14:05.355864984 +0000 UTC m=+0.113109879 container init 156239ae54068a2b081d91ebb6820de04044872a52a8c32302c4451159369d8a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigilant_jemison, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Dec 01 09:14:05 compute-0 podman[87596]: 2025-12-01 09:14:05.263483888 +0000 UTC m=+0.020728763 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:14:05 compute-0 podman[87596]: 2025-12-01 09:14:05.363130636 +0000 UTC m=+0.120375511 container start 156239ae54068a2b081d91ebb6820de04044872a52a8c32302c4451159369d8a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigilant_jemison, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:14:05 compute-0 podman[87596]: 2025-12-01 09:14:05.366775797 +0000 UTC m=+0.124020652 container attach 156239ae54068a2b081d91ebb6820de04044872a52a8c32302c4451159369d8a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigilant_jemison, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:14:05 compute-0 vigilant_jemison[87613]: 167 167
Dec 01 09:14:05 compute-0 systemd[1]: libpod-156239ae54068a2b081d91ebb6820de04044872a52a8c32302c4451159369d8a.scope: Deactivated successfully.
Dec 01 09:14:05 compute-0 podman[87596]: 2025-12-01 09:14:05.369228332 +0000 UTC m=+0.126473217 container died 156239ae54068a2b081d91ebb6820de04044872a52a8c32302c4451159369d8a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigilant_jemison, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 01 09:14:05 compute-0 systemd[1]: var-lib-containers-storage-overlay-b126c9246aee2325621ddce099efc4957757572fdf530463c07d0480948e7d2d-merged.mount: Deactivated successfully.
Dec 01 09:14:05 compute-0 podman[87596]: 2025-12-01 09:14:05.408200899 +0000 UTC m=+0.165445754 container remove 156239ae54068a2b081d91ebb6820de04044872a52a8c32302c4451159369d8a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigilant_jemison, io.buildah.version=1.39.3, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:14:05 compute-0 systemd[1]: libpod-conmon-156239ae54068a2b081d91ebb6820de04044872a52a8c32302c4451159369d8a.scope: Deactivated successfully.
Dec 01 09:14:05 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Dec 01 09:14:05 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:14:05 compute-0 ceph-mon[75031]: Deploying daemon osd.0 on compute-0
Dec 01 09:14:05 compute-0 podman[87645]: 2025-12-01 09:14:05.67294772 +0000 UTC m=+0.055293397 container create f30a799c7ae9cf6f636f9a3a94ebf1b797bc3472a13b87cd24c281d56b6ede3d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate-test, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.build-date=20250507, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec 01 09:14:05 compute-0 systemd[1]: Started libpod-conmon-f30a799c7ae9cf6f636f9a3a94ebf1b797bc3472a13b87cd24c281d56b6ede3d.scope.
Dec 01 09:14:05 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c5df66c4dda97f92248b2fc432ebcb9e977cd94c3b7d22619468b7e7b60bc43/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c5df66c4dda97f92248b2fc432ebcb9e977cd94c3b7d22619468b7e7b60bc43/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c5df66c4dda97f92248b2fc432ebcb9e977cd94c3b7d22619468b7e7b60bc43/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c5df66c4dda97f92248b2fc432ebcb9e977cd94c3b7d22619468b7e7b60bc43/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c5df66c4dda97f92248b2fc432ebcb9e977cd94c3b7d22619468b7e7b60bc43/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:05 compute-0 podman[87645]: 2025-12-01 09:14:05.645932296 +0000 UTC m=+0.028278003 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:14:05 compute-0 podman[87645]: 2025-12-01 09:14:05.740784137 +0000 UTC m=+0.123129914 container init f30a799c7ae9cf6f636f9a3a94ebf1b797bc3472a13b87cd24c281d56b6ede3d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate-test, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:14:05 compute-0 podman[87645]: 2025-12-01 09:14:05.748890004 +0000 UTC m=+0.131235721 container start f30a799c7ae9cf6f636f9a3a94ebf1b797bc3472a13b87cd24c281d56b6ede3d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate-test, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:14:05 compute-0 podman[87645]: 2025-12-01 09:14:05.756900349 +0000 UTC m=+0.139246056 container attach f30a799c7ae9cf6f636f9a3a94ebf1b797bc3472a13b87cd24c281d56b6ede3d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate-test, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS)
Dec 01 09:14:06 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate-test[87661]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Dec 01 09:14:06 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate-test[87661]:                             [--no-systemd] [--no-tmpfs]
Dec 01 09:14:06 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate-test[87661]: ceph-volume activate: error: unrecognized arguments: --bad-option
Dec 01 09:14:06 compute-0 systemd[1]: libpod-f30a799c7ae9cf6f636f9a3a94ebf1b797bc3472a13b87cd24c281d56b6ede3d.scope: Deactivated successfully.
Dec 01 09:14:06 compute-0 podman[87645]: 2025-12-01 09:14:06.461177917 +0000 UTC m=+0.843523604 container died f30a799c7ae9cf6f636f9a3a94ebf1b797bc3472a13b87cd24c281d56b6ede3d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate-test, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:14:06 compute-0 systemd[1]: var-lib-containers-storage-overlay-3c5df66c4dda97f92248b2fc432ebcb9e977cd94c3b7d22619468b7e7b60bc43-merged.mount: Deactivated successfully.
Dec 01 09:14:06 compute-0 podman[87645]: 2025-12-01 09:14:06.525701984 +0000 UTC m=+0.908047671 container remove f30a799c7ae9cf6f636f9a3a94ebf1b797bc3472a13b87cd24c281d56b6ede3d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate-test, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:14:06 compute-0 systemd[1]: libpod-conmon-f30a799c7ae9cf6f636f9a3a94ebf1b797bc3472a13b87cd24c281d56b6ede3d.scope: Deactivated successfully.
Dec 01 09:14:06 compute-0 ceph-mon[75031]: pgmap v23: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:14:06 compute-0 systemd[1]: Reloading.
Dec 01 09:14:06 compute-0 systemd-rc-local-generator[87723]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:14:06 compute-0 systemd-sysv-generator[87727]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:14:06 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v24: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:14:07 compute-0 systemd[1]: Reloading.
Dec 01 09:14:07 compute-0 systemd-rc-local-generator[87762]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:14:07 compute-0 systemd-sysv-generator[87766]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:14:07 compute-0 systemd[1]: Starting Ceph osd.0 for 5620a9fb-e540-5250-a0e8-7aaad5347e3b...
Dec 01 09:14:07 compute-0 podman[87823]: 2025-12-01 09:14:07.509962465 +0000 UTC m=+0.047675165 container create 4903056ca526f0dcffa566d26a6e32bfec448b5ce4fee82ca77c671653f943fa (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:14:07 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68b7957790c37820780a9cdb80437b209de3b565a5fd04c39aadc5c6cb832d70/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68b7957790c37820780a9cdb80437b209de3b565a5fd04c39aadc5c6cb832d70/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68b7957790c37820780a9cdb80437b209de3b565a5fd04c39aadc5c6cb832d70/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68b7957790c37820780a9cdb80437b209de3b565a5fd04c39aadc5c6cb832d70/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68b7957790c37820780a9cdb80437b209de3b565a5fd04c39aadc5c6cb832d70/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:07 compute-0 podman[87823]: 2025-12-01 09:14:07.486812549 +0000 UTC m=+0.024525249 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:14:07 compute-0 podman[87823]: 2025-12-01 09:14:07.592906373 +0000 UTC m=+0.130619083 container init 4903056ca526f0dcffa566d26a6e32bfec448b5ce4fee82ca77c671653f943fa (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, CEPH_REF=reef, org.label-schema.build-date=20250507, io.buildah.version=1.39.3)
Dec 01 09:14:07 compute-0 podman[87823]: 2025-12-01 09:14:07.600442943 +0000 UTC m=+0.138155643 container start 4903056ca526f0dcffa566d26a6e32bfec448b5ce4fee82ca77c671653f943fa (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Dec 01 09:14:07 compute-0 podman[87823]: 2025-12-01 09:14:07.603732143 +0000 UTC m=+0.141444943 container attach 4903056ca526f0dcffa566d26a6e32bfec448b5ce4fee82ca77c671653f943fa (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:14:07 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e6 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:14:08 compute-0 ceph-mon[75031]: pgmap v24: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:14:08 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate[87838]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec 01 09:14:08 compute-0 bash[87823]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec 01 09:14:08 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate[87838]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-0 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Dec 01 09:14:08 compute-0 bash[87823]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-0 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Dec 01 09:14:08 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate[87838]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Dec 01 09:14:08 compute-0 bash[87823]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Dec 01 09:14:08 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate[87838]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec 01 09:14:08 compute-0 bash[87823]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec 01 09:14:08 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate[87838]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Dec 01 09:14:08 compute-0 bash[87823]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Dec 01 09:14:08 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate[87838]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec 01 09:14:08 compute-0 bash[87823]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec 01 09:14:08 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate[87838]: --> ceph-volume raw activate successful for osd ID: 0
Dec 01 09:14:08 compute-0 bash[87823]: --> ceph-volume raw activate successful for osd ID: 0
Dec 01 09:14:08 compute-0 systemd[1]: libpod-4903056ca526f0dcffa566d26a6e32bfec448b5ce4fee82ca77c671653f943fa.scope: Deactivated successfully.
Dec 01 09:14:08 compute-0 podman[87823]: 2025-12-01 09:14:08.646513319 +0000 UTC m=+1.184226019 container died 4903056ca526f0dcffa566d26a6e32bfec448b5ce4fee82ca77c671653f943fa (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, ceph=True)
Dec 01 09:14:08 compute-0 systemd[1]: libpod-4903056ca526f0dcffa566d26a6e32bfec448b5ce4fee82ca77c671653f943fa.scope: Consumed 1.058s CPU time.
Dec 01 09:14:08 compute-0 systemd[1]: var-lib-containers-storage-overlay-68b7957790c37820780a9cdb80437b209de3b565a5fd04c39aadc5c6cb832d70-merged.mount: Deactivated successfully.
Dec 01 09:14:08 compute-0 podman[87823]: 2025-12-01 09:14:08.696107551 +0000 UTC m=+1.233820251 container remove 4903056ca526f0dcffa566d26a6e32bfec448b5ce4fee82ca77c671653f943fa (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0-activate, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec 01 09:14:08 compute-0 podman[88028]: 2025-12-01 09:14:08.882165592 +0000 UTC m=+0.037619857 container create b27d497db5b169524d5d8a1837eab1f4cd862a203a76444a6067c9edec102279 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Dec 01 09:14:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7352c180718802d143b8e0ceafb347d0f8d7a91c0b12986f42a4ab00b2eea713/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7352c180718802d143b8e0ceafb347d0f8d7a91c0b12986f42a4ab00b2eea713/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7352c180718802d143b8e0ceafb347d0f8d7a91c0b12986f42a4ab00b2eea713/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7352c180718802d143b8e0ceafb347d0f8d7a91c0b12986f42a4ab00b2eea713/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7352c180718802d143b8e0ceafb347d0f8d7a91c0b12986f42a4ab00b2eea713/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:08 compute-0 podman[88028]: 2025-12-01 09:14:08.941714557 +0000 UTC m=+0.097168852 container init b27d497db5b169524d5d8a1837eab1f4cd862a203a76444a6067c9edec102279 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 01 09:14:08 compute-0 podman[88028]: 2025-12-01 09:14:08.947506214 +0000 UTC m=+0.102960479 container start b27d497db5b169524d5d8a1837eab1f4cd862a203a76444a6067c9edec102279 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec 01 09:14:08 compute-0 bash[88028]: b27d497db5b169524d5d8a1837eab1f4cd862a203a76444a6067c9edec102279
Dec 01 09:14:08 compute-0 podman[88028]: 2025-12-01 09:14:08.865407951 +0000 UTC m=+0.020862246 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:14:08 compute-0 systemd[1]: Started Ceph osd.0 for 5620a9fb-e540-5250-a0e8-7aaad5347e3b.
Dec 01 09:14:08 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v25: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:14:08 compute-0 ceph-osd[88047]: set uid:gid to 167:167 (ceph:ceph)
Dec 01 09:14:08 compute-0 ceph-osd[88047]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-osd, pid 2
Dec 01 09:14:08 compute-0 ceph-osd[88047]: pidfile_write: ignore empty --pid-file
Dec 01 09:14:08 compute-0 ceph-osd[88047]: bdev(0x55c73633f800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 01 09:14:08 compute-0 ceph-osd[88047]: bdev(0x55c73633f800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 01 09:14:08 compute-0 ceph-osd[88047]: bdev(0x55c73633f800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:14:08 compute-0 ceph-osd[88047]: bdev(0x55c73633f800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:14:08 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec 01 09:14:08 compute-0 ceph-osd[88047]: bdev(0x55c737181800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 01 09:14:08 compute-0 ceph-osd[88047]: bdev(0x55c737181800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 01 09:14:08 compute-0 ceph-osd[88047]: bdev(0x55c737181800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:14:08 compute-0 ceph-osd[88047]: bdev(0x55c737181800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:14:08 compute-0 ceph-osd[88047]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 20 GiB
Dec 01 09:14:08 compute-0 sudo[87531]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:08 compute-0 ceph-osd[88047]: bdev(0x55c737181800 /var/lib/ceph/osd/ceph-0/block) close
Dec 01 09:14:08 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:14:09 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:09 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:14:09 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:09 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "osd.1"} v 0) v1
Dec 01 09:14:09 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Dec 01 09:14:09 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:14:09 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:14:09 compute-0 ceph-mgr[75324]: [cephadm INFO cephadm.serve] Deploying daemon osd.1 on compute-0
Dec 01 09:14:09 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Deploying daemon osd.1 on compute-0
Dec 01 09:14:09 compute-0 sudo[88060]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:09 compute-0 sudo[88060]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:09 compute-0 sudo[88060]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:09 compute-0 sudo[88085]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:14:09 compute-0 sudo[88085]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:09 compute-0 sudo[88085]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:09 compute-0 sudo[88110]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:09 compute-0 sudo[88110]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:09 compute-0 sudo[88110]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:09 compute-0 sudo[88135]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 _orch deploy --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b
Dec 01 09:14:09 compute-0 sudo[88135]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:09 compute-0 ceph-osd[88047]: bdev(0x55c73633f800 /var/lib/ceph/osd/ceph-0/block) close
Dec 01 09:14:09 compute-0 ceph-osd[88047]: starting osd.0 osd_data /var/lib/ceph/osd/ceph-0 /var/lib/ceph/osd/ceph-0/journal
Dec 01 09:14:09 compute-0 ceph-osd[88047]: load: jerasure load: lrc 
Dec 01 09:14:09 compute-0 ceph-osd[88047]: bdev(0x55c737202c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 01 09:14:09 compute-0 ceph-osd[88047]: bdev(0x55c737202c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 01 09:14:09 compute-0 ceph-osd[88047]: bdev(0x55c737202c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:14:09 compute-0 ceph-osd[88047]: bdev(0x55c737202c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:14:09 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec 01 09:14:09 compute-0 ceph-osd[88047]: bdev(0x55c737202c00 /var/lib/ceph/osd/ceph-0/block) close
Dec 01 09:14:09 compute-0 podman[88203]: 2025-12-01 09:14:09.550386701 +0000 UTC m=+0.039033181 container create bea8b85e16d5a1a49ff987beb9e59dcd60b9d816520b5f660ccc648b13606f8f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_shaw, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:14:09 compute-0 systemd[1]: Started libpod-conmon-bea8b85e16d5a1a49ff987beb9e59dcd60b9d816520b5f660ccc648b13606f8f.scope.
Dec 01 09:14:09 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:09 compute-0 podman[88203]: 2025-12-01 09:14:09.533838777 +0000 UTC m=+0.022485277 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:14:09 compute-0 podman[88203]: 2025-12-01 09:14:09.644349385 +0000 UTC m=+0.132995885 container init bea8b85e16d5a1a49ff987beb9e59dcd60b9d816520b5f660ccc648b13606f8f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_shaw, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:14:09 compute-0 podman[88203]: 2025-12-01 09:14:09.650897645 +0000 UTC m=+0.139544125 container start bea8b85e16d5a1a49ff987beb9e59dcd60b9d816520b5f660ccc648b13606f8f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_shaw, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:14:09 compute-0 podman[88203]: 2025-12-01 09:14:09.653848815 +0000 UTC m=+0.142495295 container attach bea8b85e16d5a1a49ff987beb9e59dcd60b9d816520b5f660ccc648b13606f8f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_shaw, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:14:09 compute-0 busy_shaw[88224]: 167 167
Dec 01 09:14:09 compute-0 systemd[1]: libpod-bea8b85e16d5a1a49ff987beb9e59dcd60b9d816520b5f660ccc648b13606f8f.scope: Deactivated successfully.
Dec 01 09:14:09 compute-0 podman[88203]: 2025-12-01 09:14:09.655875296 +0000 UTC m=+0.144521776 container died bea8b85e16d5a1a49ff987beb9e59dcd60b9d816520b5f660ccc648b13606f8f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_shaw, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True)
Dec 01 09:14:09 compute-0 systemd[1]: var-lib-containers-storage-overlay-713ca73f911a02f074e1d9102592b8b5c854294e32711516ee336769910d7809-merged.mount: Deactivated successfully.
Dec 01 09:14:09 compute-0 podman[88203]: 2025-12-01 09:14:09.694732331 +0000 UTC m=+0.183378811 container remove bea8b85e16d5a1a49ff987beb9e59dcd60b9d816520b5f660ccc648b13606f8f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_shaw, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 01 09:14:09 compute-0 systemd[1]: libpod-conmon-bea8b85e16d5a1a49ff987beb9e59dcd60b9d816520b5f660ccc648b13606f8f.scope: Deactivated successfully.
Dec 01 09:14:09 compute-0 ceph-osd[88047]: bdev(0x55c737202c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 01 09:14:09 compute-0 ceph-osd[88047]: bdev(0x55c737202c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 01 09:14:09 compute-0 ceph-osd[88047]: bdev(0x55c737202c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:14:09 compute-0 ceph-osd[88047]: bdev(0x55c737202c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:14:09 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec 01 09:14:09 compute-0 ceph-osd[88047]: bdev(0x55c737202c00 /var/lib/ceph/osd/ceph-0/block) close
Dec 01 09:14:09 compute-0 podman[88259]: 2025-12-01 09:14:09.939215833 +0000 UTC m=+0.048569721 container create a5cad3e9e8395fff46cceab38f7df36ef7eb6ace066a3f613bf1b3c958b1cf25 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate-test, ceph=True, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:14:09 compute-0 systemd[1]: Started libpod-conmon-a5cad3e9e8395fff46cceab38f7df36ef7eb6ace066a3f613bf1b3c958b1cf25.scope.
Dec 01 09:14:09 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c21639ba04608201591abc3a56db0ee8dfc2c0ca08db065fb07b2e4a036ea93/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c21639ba04608201591abc3a56db0ee8dfc2c0ca08db065fb07b2e4a036ea93/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c21639ba04608201591abc3a56db0ee8dfc2c0ca08db065fb07b2e4a036ea93/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c21639ba04608201591abc3a56db0ee8dfc2c0ca08db065fb07b2e4a036ea93/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c21639ba04608201591abc3a56db0ee8dfc2c0ca08db065fb07b2e4a036ea93/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:10 compute-0 podman[88259]: 2025-12-01 09:14:10.010360922 +0000 UTC m=+0.119714840 container init a5cad3e9e8395fff46cceab38f7df36ef7eb6ace066a3f613bf1b3c958b1cf25 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate-test, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 01 09:14:10 compute-0 podman[88259]: 2025-12-01 09:14:09.915533811 +0000 UTC m=+0.024887729 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:14:10 compute-0 ceph-mon[75031]: pgmap v25: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:14:10 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:10 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:10 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Dec 01 09:14:10 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:14:10 compute-0 ceph-mon[75031]: Deploying daemon osd.1 on compute-0
Dec 01 09:14:10 compute-0 podman[88259]: 2025-12-01 09:14:10.017947023 +0000 UTC m=+0.127300911 container start a5cad3e9e8395fff46cceab38f7df36ef7eb6ace066a3f613bf1b3c958b1cf25 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate-test, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec 01 09:14:10 compute-0 podman[88259]: 2025-12-01 09:14:10.021412429 +0000 UTC m=+0.130766347 container attach a5cad3e9e8395fff46cceab38f7df36ef7eb6ace066a3f613bf1b3c958b1cf25 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate-test, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:14:10 compute-0 ceph-osd[88047]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Dec 01 09:14:10 compute-0 ceph-osd[88047]: osd.0:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Dec 01 09:14:10 compute-0 ceph-osd[88047]: bdev(0x55c737202c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 01 09:14:10 compute-0 ceph-osd[88047]: bdev(0x55c737202c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 01 09:14:10 compute-0 ceph-osd[88047]: bdev(0x55c737202c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:14:10 compute-0 ceph-osd[88047]: bdev(0x55c737202c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:14:10 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec 01 09:14:10 compute-0 ceph-osd[88047]: bdev(0x55c737203400 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 01 09:14:10 compute-0 ceph-osd[88047]: bdev(0x55c737203400 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 01 09:14:10 compute-0 ceph-osd[88047]: bdev(0x55c737203400 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:14:10 compute-0 ceph-osd[88047]: bdev(0x55c737203400 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:14:10 compute-0 ceph-osd[88047]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 20 GiB
Dec 01 09:14:10 compute-0 ceph-osd[88047]: bluefs mount
Dec 01 09:14:10 compute-0 ceph-osd[88047]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: bluefs mount shared_bdev_used = 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: RocksDB version: 7.9.2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Git sha 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Compile date 2025-05-06 23:30:25
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: DB SUMMARY
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: DB Session ID:  Z9IFZ8MDJU8BBS3TV8NA
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: CURRENT file:  CURRENT
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: IDENTITY file:  IDENTITY
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                         Options.error_if_exists: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                       Options.create_if_missing: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                         Options.paranoid_checks: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                                     Options.env: 0x55c7371d3c70
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                                Options.info_log: 0x55c7363c68a0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.max_file_opening_threads: 16
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                              Options.statistics: (nil)
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                               Options.use_fsync: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                       Options.max_log_file_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                         Options.allow_fallocate: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.use_direct_reads: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.create_missing_column_families: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                              Options.db_log_dir: 
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                                 Options.wal_dir: db.wal
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.advise_random_on_open: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.write_buffer_manager: 0x55c7372dc460
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                            Options.rate_limiter: (nil)
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.unordered_write: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                               Options.row_cache: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                              Options.wal_filter: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.allow_ingest_behind: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.two_write_queues: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.manual_wal_flush: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.wal_compression: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.atomic_flush: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                 Options.log_readahead_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.allow_data_in_errors: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.db_host_id: __hostname__
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.max_background_jobs: 4
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.max_background_compactions: -1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.max_subcompactions: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.max_open_files: -1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.bytes_per_sync: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.max_background_flushes: -1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Compression algorithms supported:
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         kZSTD supported: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         kXpressCompression supported: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         kBZip2Compression supported: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         kLZ4Compression supported: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         kZlibCompression supported: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         kLZ4HCCompression supported: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         kSnappyCompression supported: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7363c62c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55c7363b31f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7363c62c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55c7363b31f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7363c62c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55c7363b31f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7363c62c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55c7363b31f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7363c62c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55c7363b31f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7363c62c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55c7363b31f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7363c62c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55c7363b31f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7363c6240)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55c7363b3090
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7363c6240)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55c7363b3090
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7363c6240)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55c7363b3090
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 801eb657-3ccc-48c9-95d8-faada9292b70
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580450082975, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580450083179, "job": 1, "event": "recovery_finished"}
Dec 01 09:14:10 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old nid_max 1025
Dec 01 09:14:10 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old blobid_max 10240
Dec 01 09:14:10 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Dec 01 09:14:10 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta min_alloc_size 0x1000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: freelist init
Dec 01 09:14:10 compute-0 ceph-osd[88047]: freelist _read_cfg
Dec 01 09:14:10 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _init_alloc loaded 20 GiB in 2 extents, allocator type hybrid, capacity 0x4ffc00000, block size 0x1000, free 0x4ffbfd000, fragmentation 1.9e-07
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Dec 01 09:14:10 compute-0 ceph-osd[88047]: bluefs umount
Dec 01 09:14:10 compute-0 ceph-osd[88047]: bdev(0x55c737203400 /var/lib/ceph/osd/ceph-0/block) close
Dec 01 09:14:10 compute-0 ceph-osd[88047]: bdev(0x55c737203400 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 01 09:14:10 compute-0 ceph-osd[88047]: bdev(0x55c737203400 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 01 09:14:10 compute-0 ceph-osd[88047]: bdev(0x55c737203400 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:14:10 compute-0 ceph-osd[88047]: bdev(0x55c737203400 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:14:10 compute-0 ceph-osd[88047]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 20 GiB
Dec 01 09:14:10 compute-0 ceph-osd[88047]: bluefs mount
Dec 01 09:14:10 compute-0 ceph-osd[88047]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: bluefs mount shared_bdev_used = 4718592
Dec 01 09:14:10 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: RocksDB version: 7.9.2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Git sha 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Compile date 2025-05-06 23:30:25
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: DB SUMMARY
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: DB Session ID:  Z9IFZ8MDJU8BBS3TV8NB
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: CURRENT file:  CURRENT
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: IDENTITY file:  IDENTITY
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                         Options.error_if_exists: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                       Options.create_if_missing: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                         Options.paranoid_checks: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                                     Options.env: 0x55c7371d3f10
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                                Options.info_log: 0x55c7363c6360
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.max_file_opening_threads: 16
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                              Options.statistics: (nil)
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                               Options.use_fsync: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                       Options.max_log_file_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                         Options.allow_fallocate: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.use_direct_reads: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.create_missing_column_families: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                              Options.db_log_dir: 
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                                 Options.wal_dir: db.wal
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.advise_random_on_open: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.write_buffer_manager: 0x55c7372dc6e0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                            Options.rate_limiter: (nil)
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.unordered_write: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                               Options.row_cache: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                              Options.wal_filter: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.allow_ingest_behind: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.two_write_queues: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.manual_wal_flush: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.wal_compression: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.atomic_flush: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                 Options.log_readahead_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.allow_data_in_errors: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.db_host_id: __hostname__
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.max_background_jobs: 4
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.max_background_compactions: -1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.max_subcompactions: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.max_open_files: -1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.bytes_per_sync: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.max_background_flushes: -1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Compression algorithms supported:
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         kZSTD supported: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         kXpressCompression supported: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         kBZip2Compression supported: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         kLZ4Compression supported: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         kZlibCompression supported: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         kLZ4HCCompression supported: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         kSnappyCompression supported: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7371cf560)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55c7363b31f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7371cf560)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55c7363b31f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7371cf560)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55c7363b31f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7371cf560)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55c7363b31f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7371cf560)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55c7363b31f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7371cf560)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55c7363b31f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7371cf560)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55c7363b31f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7371cf580)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55c7363b3090
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7371cf580)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55c7363b3090
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c7371cf580)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55c7363b3090
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 801eb657-3ccc-48c9-95d8-faada9292b70
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580450360067, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580450364881, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580450, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "801eb657-3ccc-48c9-95d8-faada9292b70", "db_session_id": "Z9IFZ8MDJU8BBS3TV8NB", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580450367819, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1594, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580450, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "801eb657-3ccc-48c9-95d8-faada9292b70", "db_session_id": "Z9IFZ8MDJU8BBS3TV8NB", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580450370457, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580450, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "801eb657-3ccc-48c9-95d8-faada9292b70", "db_session_id": "Z9IFZ8MDJU8BBS3TV8NB", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580450371782, "job": 1, "event": "recovery_finished"}
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55c736521c00
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: DB pointer 0x55c7372c5a00
Dec 01 09:14:10 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec 01 09:14:10 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super from 4, latest 4
Dec 01 09:14:10 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super done
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:14:10 compute-0 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b3090#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b3090#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b3090#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 01 09:14:10 compute-0 ceph-osd[88047]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Dec 01 09:14:10 compute-0 ceph-osd[88047]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/hello/cls_hello.cc:316: loading cls_hello
Dec 01 09:14:10 compute-0 ceph-osd[88047]: _get_class not permitted to load lua
Dec 01 09:14:10 compute-0 ceph-osd[88047]: _get_class not permitted to load sdk
Dec 01 09:14:10 compute-0 ceph-osd[88047]: _get_class not permitted to load test_remote_reads
Dec 01 09:14:10 compute-0 ceph-osd[88047]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Dec 01 09:14:10 compute-0 ceph-osd[88047]: osd.0 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Dec 01 09:14:10 compute-0 ceph-osd[88047]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Dec 01 09:14:10 compute-0 ceph-osd[88047]: osd.0 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Dec 01 09:14:10 compute-0 ceph-osd[88047]: osd.0 0 load_pgs
Dec 01 09:14:10 compute-0 ceph-osd[88047]: osd.0 0 load_pgs opened 0 pgs
Dec 01 09:14:10 compute-0 ceph-osd[88047]: osd.0 0 log_to_monitors true
Dec 01 09:14:10 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0[88043]: 2025-12-01T09:14:10.414+0000 7f107cb51740 -1 osd.0 0 log_to_monitors true
Dec 01 09:14:10 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]} v 0) v1
Dec 01 09:14:10 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='osd.0 [v2:192.168.122.100:6802/4016881853,v1:192.168.122.100:6803/4016881853]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch
Dec 01 09:14:10 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate-test[88275]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Dec 01 09:14:10 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate-test[88275]:                             [--no-systemd] [--no-tmpfs]
Dec 01 09:14:10 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate-test[88275]: ceph-volume activate: error: unrecognized arguments: --bad-option
Dec 01 09:14:10 compute-0 systemd[1]: libpod-a5cad3e9e8395fff46cceab38f7df36ef7eb6ace066a3f613bf1b3c958b1cf25.scope: Deactivated successfully.
Dec 01 09:14:10 compute-0 podman[88259]: 2025-12-01 09:14:10.74387125 +0000 UTC m=+0.853225148 container died a5cad3e9e8395fff46cceab38f7df36ef7eb6ace066a3f613bf1b3c958b1cf25 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate-test, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Dec 01 09:14:10 compute-0 systemd[1]: var-lib-containers-storage-overlay-3c21639ba04608201591abc3a56db0ee8dfc2c0ca08db065fb07b2e4a036ea93-merged.mount: Deactivated successfully.
Dec 01 09:14:10 compute-0 podman[88259]: 2025-12-01 09:14:10.795190094 +0000 UTC m=+0.904544002 container remove a5cad3e9e8395fff46cceab38f7df36ef7eb6ace066a3f613bf1b3c958b1cf25 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate-test, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:14:10 compute-0 systemd[1]: libpod-conmon-a5cad3e9e8395fff46cceab38f7df36ef7eb6ace066a3f613bf1b3c958b1cf25.scope: Deactivated successfully.
Dec 01 09:14:10 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v26: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:14:11 compute-0 systemd[1]: Reloading.
Dec 01 09:14:11 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e6 do_prune osdmap full prune enabled
Dec 01 09:14:11 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e6 encode_pending skipping prime_pg_temp; mapping job did not start
Dec 01 09:14:11 compute-0 ceph-mon[75031]: from='osd.0 [v2:192.168.122.100:6802/4016881853,v1:192.168.122.100:6803/4016881853]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch
Dec 01 09:14:11 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='osd.0 [v2:192.168.122.100:6802/4016881853,v1:192.168.122.100:6803/4016881853]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished
Dec 01 09:14:11 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e7 e7: 3 total, 0 up, 3 in
Dec 01 09:14:11 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e7: 3 total, 0 up, 3 in
Dec 01 09:14:11 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=compute-0", "root=default"]} v 0) v1
Dec 01 09:14:11 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='osd.0 [v2:192.168.122.100:6802/4016881853,v1:192.168.122.100:6803/4016881853]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]: dispatch
Dec 01 09:14:11 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e7 create-or-move crush item name 'osd.0' initial_weight 0.0195 at location {host=compute-0,root=default}
Dec 01 09:14:11 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) v1
Dec 01 09:14:11 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 01 09:14:11 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec 01 09:14:11 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:11 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec 01 09:14:11 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:11 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec 01 09:14:11 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec 01 09:14:11 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec 01 09:14:11 compute-0 systemd-rc-local-generator[88745]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:14:11 compute-0 systemd-sysv-generator[88748]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:14:11 compute-0 systemd[1]: Reloading.
Dec 01 09:14:11 compute-0 systemd-sysv-generator[88788]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:14:11 compute-0 systemd-rc-local-generator[88785]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:14:11 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Dec 01 09:14:11 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Dec 01 09:14:11 compute-0 systemd[1]: Starting Ceph osd.1 for 5620a9fb-e540-5250-a0e8-7aaad5347e3b...
Dec 01 09:14:11 compute-0 podman[88843]: 2025-12-01 09:14:11.774458934 +0000 UTC m=+0.038693380 container create 6bc2fbcc342a035d8d509594a1c0281a207cd697fa9cb7c70c3a34c6dc652eac (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:14:11 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0c3eff9402a8799d4f2c5353443051a495ce202894b25216d82b35e900c90b1/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0c3eff9402a8799d4f2c5353443051a495ce202894b25216d82b35e900c90b1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0c3eff9402a8799d4f2c5353443051a495ce202894b25216d82b35e900c90b1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0c3eff9402a8799d4f2c5353443051a495ce202894b25216d82b35e900c90b1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0c3eff9402a8799d4f2c5353443051a495ce202894b25216d82b35e900c90b1/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:11 compute-0 podman[88843]: 2025-12-01 09:14:11.837584088 +0000 UTC m=+0.101818554 container init 6bc2fbcc342a035d8d509594a1c0281a207cd697fa9cb7c70c3a34c6dc652eac (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:14:11 compute-0 podman[88843]: 2025-12-01 09:14:11.848859082 +0000 UTC m=+0.113093528 container start 6bc2fbcc342a035d8d509594a1c0281a207cd697fa9cb7c70c3a34c6dc652eac (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec 01 09:14:11 compute-0 podman[88843]: 2025-12-01 09:14:11.852637127 +0000 UTC m=+0.116871573 container attach 6bc2fbcc342a035d8d509594a1c0281a207cd697fa9cb7c70c3a34c6dc652eac (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:14:11 compute-0 podman[88843]: 2025-12-01 09:14:11.757723434 +0000 UTC m=+0.021957900 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:14:12 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e7 do_prune osdmap full prune enabled
Dec 01 09:14:12 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e7 encode_pending skipping prime_pg_temp; mapping job did not start
Dec 01 09:14:12 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='osd.0 [v2:192.168.122.100:6802/4016881853,v1:192.168.122.100:6803/4016881853]' entity='osd.0' cmd='[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]': finished
Dec 01 09:14:12 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e8 e8: 3 total, 0 up, 3 in
Dec 01 09:14:12 compute-0 ceph-osd[88047]: osd.0 0 done with init, starting boot process
Dec 01 09:14:12 compute-0 ceph-osd[88047]: osd.0 0 start_boot
Dec 01 09:14:12 compute-0 ceph-osd[88047]: osd.0 0 maybe_override_options_for_qos osd_max_backfills set to 1
Dec 01 09:14:12 compute-0 ceph-osd[88047]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Dec 01 09:14:12 compute-0 ceph-osd[88047]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Dec 01 09:14:12 compute-0 ceph-osd[88047]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Dec 01 09:14:12 compute-0 ceph-osd[88047]: osd.0 0  bench count 12288000 bsize 4 KiB
Dec 01 09:14:12 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e8: 3 total, 0 up, 3 in
Dec 01 09:14:12 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) v1
Dec 01 09:14:12 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 01 09:14:12 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec 01 09:14:12 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:12 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec 01 09:14:12 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:12 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec 01 09:14:12 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec 01 09:14:12 compute-0 ceph-mon[75031]: pgmap v26: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:14:12 compute-0 ceph-mon[75031]: from='osd.0 [v2:192.168.122.100:6802/4016881853,v1:192.168.122.100:6803/4016881853]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished
Dec 01 09:14:12 compute-0 ceph-mon[75031]: osdmap e7: 3 total, 0 up, 3 in
Dec 01 09:14:12 compute-0 ceph-mon[75031]: from='osd.0 [v2:192.168.122.100:6802/4016881853,v1:192.168.122.100:6803/4016881853]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]: dispatch
Dec 01 09:14:12 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 01 09:14:12 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:12 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:12 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec 01 09:14:12 compute-0 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.0 v2:192.168.122.100:6802/4016881853; not ready for session (expect reconnect)
Dec 01 09:14:12 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) v1
Dec 01 09:14:12 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 01 09:14:12 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec 01 09:14:12 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e8 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:14:12 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate[88859]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Dec 01 09:14:12 compute-0 bash[88843]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Dec 01 09:14:12 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate[88859]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-1 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1
Dec 01 09:14:12 compute-0 bash[88843]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-1 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1
Dec 01 09:14:12 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v29: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:14:12 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:14:12
Dec 01 09:14:12 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:14:12 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:14:12 compute-0 ceph-mgr[75324]: [balancer INFO root] No pools available
Dec 01 09:14:12 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate[88859]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1
Dec 01 09:14:12 compute-0 bash[88843]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1
Dec 01 09:14:12 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate[88859]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Dec 01 09:14:12 compute-0 bash[88843]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Dec 01 09:14:12 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate[88859]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-1/block
Dec 01 09:14:12 compute-0 bash[88843]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-1/block
Dec 01 09:14:12 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate[88859]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Dec 01 09:14:12 compute-0 bash[88843]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Dec 01 09:14:12 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate[88859]: --> ceph-volume raw activate successful for osd ID: 1
Dec 01 09:14:12 compute-0 bash[88843]: --> ceph-volume raw activate successful for osd ID: 1
Dec 01 09:14:13 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:14:13 compute-0 systemd[1]: libpod-6bc2fbcc342a035d8d509594a1c0281a207cd697fa9cb7c70c3a34c6dc652eac.scope: Deactivated successfully.
Dec 01 09:14:13 compute-0 systemd[1]: libpod-6bc2fbcc342a035d8d509594a1c0281a207cd697fa9cb7c70c3a34c6dc652eac.scope: Consumed 1.197s CPU time.
Dec 01 09:14:13 compute-0 podman[88843]: 2025-12-01 09:14:13.032653426 +0000 UTC m=+1.296887872 container died 6bc2fbcc342a035d8d509594a1c0281a207cd697fa9cb7c70c3a34c6dc652eac (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:14:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:14:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:14:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:14:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:14:13 compute-0 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.0 v2:192.168.122.100:6802/4016881853; not ready for session (expect reconnect)
Dec 01 09:14:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:14:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:14:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:14:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:14:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) v1
Dec 01 09:14:13 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 01 09:14:13 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec 01 09:14:13 compute-0 ceph-mon[75031]: from='osd.0 [v2:192.168.122.100:6802/4016881853,v1:192.168.122.100:6803/4016881853]' entity='osd.0' cmd='[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]': finished
Dec 01 09:14:13 compute-0 ceph-mon[75031]: osdmap e8: 3 total, 0 up, 3 in
Dec 01 09:14:13 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 01 09:14:13 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:13 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:13 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 01 09:14:13 compute-0 systemd[1]: var-lib-containers-storage-overlay-f0c3eff9402a8799d4f2c5353443051a495ce202894b25216d82b35e900c90b1-merged.mount: Deactivated successfully.
Dec 01 09:14:13 compute-0 podman[88843]: 2025-12-01 09:14:13.154618584 +0000 UTC m=+1.418853030 container remove 6bc2fbcc342a035d8d509594a1c0281a207cd697fa9cb7c70c3a34c6dc652eac (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1-activate, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef)
Dec 01 09:14:13 compute-0 podman[89033]: 2025-12-01 09:14:13.341977375 +0000 UTC m=+0.022108915 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:14:13 compute-0 podman[89033]: 2025-12-01 09:14:13.442631553 +0000 UTC m=+0.122763083 container create 2203330e3b4c084a5051630983687321ba42178f6b21acc3ece7e642356ce8a9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507)
Dec 01 09:14:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57d2f59c1ffba8320e050daaf2d47de0e88136e8bfe81b0b3234263632cbdfd9/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57d2f59c1ffba8320e050daaf2d47de0e88136e8bfe81b0b3234263632cbdfd9/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57d2f59c1ffba8320e050daaf2d47de0e88136e8bfe81b0b3234263632cbdfd9/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57d2f59c1ffba8320e050daaf2d47de0e88136e8bfe81b0b3234263632cbdfd9/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57d2f59c1ffba8320e050daaf2d47de0e88136e8bfe81b0b3234263632cbdfd9/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:13 compute-0 podman[89033]: 2025-12-01 09:14:13.665161245 +0000 UTC m=+0.345292775 container init 2203330e3b4c084a5051630983687321ba42178f6b21acc3ece7e642356ce8a9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, ceph=True)
Dec 01 09:14:13 compute-0 podman[89033]: 2025-12-01 09:14:13.671740466 +0000 UTC m=+0.351871986 container start 2203330e3b4c084a5051630983687321ba42178f6b21acc3ece7e642356ce8a9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec 01 09:14:13 compute-0 ceph-osd[89052]: set uid:gid to 167:167 (ceph:ceph)
Dec 01 09:14:13 compute-0 ceph-osd[89052]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-osd, pid 2
Dec 01 09:14:13 compute-0 ceph-osd[89052]: pidfile_write: ignore empty --pid-file
Dec 01 09:14:13 compute-0 ceph-osd[89052]: bdev(0x555f190d5800 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec 01 09:14:13 compute-0 ceph-osd[89052]: bdev(0x555f190d5800 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec 01 09:14:13 compute-0 ceph-osd[89052]: bdev(0x555f190d5800 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:14:13 compute-0 ceph-osd[89052]: bdev(0x555f190d5800 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:14:13 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec 01 09:14:13 compute-0 ceph-osd[89052]: bdev(0x555f19f0d800 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec 01 09:14:13 compute-0 ceph-osd[89052]: bdev(0x555f19f0d800 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec 01 09:14:13 compute-0 ceph-osd[89052]: bdev(0x555f19f0d800 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:14:13 compute-0 ceph-osd[89052]: bdev(0x555f19f0d800 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:14:13 compute-0 ceph-osd[89052]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 20 GiB
Dec 01 09:14:13 compute-0 ceph-osd[89052]: bdev(0x555f19f0d800 /var/lib/ceph/osd/ceph-1/block) close
Dec 01 09:14:13 compute-0 bash[89033]: 2203330e3b4c084a5051630983687321ba42178f6b21acc3ece7e642356ce8a9
Dec 01 09:14:13 compute-0 systemd[1]: Started Ceph osd.1 for 5620a9fb-e540-5250-a0e8-7aaad5347e3b.
Dec 01 09:14:13 compute-0 sudo[88135]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:14:13 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:14:13 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "osd.2"} v 0) v1
Dec 01 09:14:13 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Dec 01 09:14:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:14:13 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:14:13 compute-0 ceph-mgr[75324]: [cephadm INFO cephadm.serve] Deploying daemon osd.2 on compute-0
Dec 01 09:14:13 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Deploying daemon osd.2 on compute-0
Dec 01 09:14:13 compute-0 sudo[89065]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:13 compute-0 sudo[89065]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:13 compute-0 sudo[89065]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:13 compute-0 ceph-osd[89052]: bdev(0x555f190d5800 /var/lib/ceph/osd/ceph-1/block) close
Dec 01 09:14:14 compute-0 sudo[89090]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:14:14 compute-0 sudo[89090]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:14 compute-0 sudo[89090]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:14 compute-0 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.0 v2:192.168.122.100:6802/4016881853; not ready for session (expect reconnect)
Dec 01 09:14:14 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) v1
Dec 01 09:14:14 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 01 09:14:14 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec 01 09:14:14 compute-0 ceph-mon[75031]: purged_snaps scrub starts
Dec 01 09:14:14 compute-0 ceph-mon[75031]: purged_snaps scrub ok
Dec 01 09:14:14 compute-0 ceph-mon[75031]: pgmap v29: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:14:14 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 01 09:14:14 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:14 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:14 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Dec 01 09:14:14 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:14:14 compute-0 ceph-mon[75031]: Deploying daemon osd.2 on compute-0
Dec 01 09:14:14 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 01 09:14:14 compute-0 sudo[89117]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:14 compute-0 sudo[89117]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:14 compute-0 sudo[89117]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:14 compute-0 sudo[89142]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 _orch deploy --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b
Dec 01 09:14:14 compute-0 sudo[89142]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:14 compute-0 ceph-osd[89052]: starting osd.1 osd_data /var/lib/ceph/osd/ceph-1 /var/lib/ceph/osd/ceph-1/journal
Dec 01 09:14:14 compute-0 ceph-osd[89052]: load: jerasure load: lrc 
Dec 01 09:14:14 compute-0 ceph-osd[89052]: bdev(0x555f19f8ec00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec 01 09:14:14 compute-0 ceph-osd[89052]: bdev(0x555f19f8ec00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec 01 09:14:14 compute-0 ceph-osd[89052]: bdev(0x555f19f8ec00 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:14:14 compute-0 ceph-osd[89052]: bdev(0x555f19f8ec00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:14:14 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec 01 09:14:14 compute-0 ceph-osd[89052]: bdev(0x555f19f8ec00 /var/lib/ceph/osd/ceph-1/block) close
Dec 01 09:14:14 compute-0 podman[89215]: 2025-12-01 09:14:14.744373712 +0000 UTC m=+0.239929594 container create 0c9713b52fcf1503ee3248574e2cf69bc30b1f8cd72a81e03c95b3d2b07e4980 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_heisenberg, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True)
Dec 01 09:14:14 compute-0 ceph-osd[89052]: bdev(0x555f19f8ec00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec 01 09:14:14 compute-0 ceph-osd[89052]: bdev(0x555f19f8ec00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec 01 09:14:14 compute-0 ceph-osd[89052]: bdev(0x555f19f8ec00 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:14:14 compute-0 ceph-osd[89052]: bdev(0x555f19f8ec00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:14:14 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec 01 09:14:14 compute-0 ceph-osd[89052]: bdev(0x555f19f8ec00 /var/lib/ceph/osd/ceph-1/block) close
Dec 01 09:14:14 compute-0 systemd[1]: Started libpod-conmon-0c9713b52fcf1503ee3248574e2cf69bc30b1f8cd72a81e03c95b3d2b07e4980.scope.
Dec 01 09:14:14 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:14 compute-0 podman[89215]: 2025-12-01 09:14:14.724664351 +0000 UTC m=+0.220220253 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:14:14 compute-0 podman[89215]: 2025-12-01 09:14:14.832078586 +0000 UTC m=+0.327634488 container init 0c9713b52fcf1503ee3248574e2cf69bc30b1f8cd72a81e03c95b3d2b07e4980 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_heisenberg, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:14:14 compute-0 podman[89215]: 2025-12-01 09:14:14.838819741 +0000 UTC m=+0.334375623 container start 0c9713b52fcf1503ee3248574e2cf69bc30b1f8cd72a81e03c95b3d2b07e4980 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_heisenberg, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0)
Dec 01 09:14:14 compute-0 flamboyant_heisenberg[89236]: 167 167
Dec 01 09:14:14 compute-0 systemd[1]: libpod-0c9713b52fcf1503ee3248574e2cf69bc30b1f8cd72a81e03c95b3d2b07e4980.scope: Deactivated successfully.
Dec 01 09:14:14 compute-0 podman[89215]: 2025-12-01 09:14:14.863947797 +0000 UTC m=+0.359503699 container attach 0c9713b52fcf1503ee3248574e2cf69bc30b1f8cd72a81e03c95b3d2b07e4980 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_heisenberg, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3)
Dec 01 09:14:14 compute-0 podman[89215]: 2025-12-01 09:14:14.864355359 +0000 UTC m=+0.359911241 container died 0c9713b52fcf1503ee3248574e2cf69bc30b1f8cd72a81e03c95b3d2b07e4980 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_heisenberg, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Dec 01 09:14:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-0e776f518f3fd9a519057cca5037d204571739e1cf2e78b5472cd05f523adbf7-merged.mount: Deactivated successfully.
Dec 01 09:14:14 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v30: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:14:15 compute-0 ceph-osd[89052]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Dec 01 09:14:15 compute-0 ceph-osd[89052]: osd.1:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Dec 01 09:14:15 compute-0 ceph-osd[89052]: bdev(0x555f19f8ec00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec 01 09:14:15 compute-0 ceph-osd[89052]: bdev(0x555f19f8ec00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec 01 09:14:15 compute-0 ceph-osd[89052]: bdev(0x555f19f8ec00 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:14:15 compute-0 ceph-osd[89052]: bdev(0x555f19f8ec00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:14:15 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec 01 09:14:15 compute-0 ceph-osd[89052]: bdev(0x555f19f8f400 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec 01 09:14:15 compute-0 ceph-osd[89052]: bdev(0x555f19f8f400 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec 01 09:14:15 compute-0 ceph-osd[89052]: bdev(0x555f19f8f400 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:14:15 compute-0 ceph-osd[89052]: bdev(0x555f19f8f400 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:14:15 compute-0 ceph-osd[89052]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 20 GiB
Dec 01 09:14:15 compute-0 ceph-osd[89052]: bluefs mount
Dec 01 09:14:15 compute-0 ceph-osd[89052]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Dec 01 09:14:15 compute-0 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.0 v2:192.168.122.100:6802/4016881853; not ready for session (expect reconnect)
Dec 01 09:14:15 compute-0 ceph-osd[89052]: bluefs mount shared_bdev_used = 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: RocksDB version: 7.9.2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Git sha 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Compile date 2025-05-06 23:30:25
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: DB SUMMARY
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: DB Session ID:  NR1ZS73OTAFE67X1H6FN
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: CURRENT file:  CURRENT
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: IDENTITY file:  IDENTITY
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                         Options.error_if_exists: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                       Options.create_if_missing: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                         Options.paranoid_checks: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                                     Options.env: 0x555f19f5fc70
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                                Options.info_log: 0x555f1915c8a0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.max_file_opening_threads: 16
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                              Options.statistics: (nil)
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                               Options.use_fsync: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                       Options.max_log_file_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                         Options.allow_fallocate: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.use_direct_reads: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.create_missing_column_families: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                              Options.db_log_dir: 
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                                 Options.wal_dir: db.wal
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.advise_random_on_open: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.write_buffer_manager: 0x555f1a068460
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                            Options.rate_limiter: (nil)
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.unordered_write: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                               Options.row_cache: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                              Options.wal_filter: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.allow_ingest_behind: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.two_write_queues: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.manual_wal_flush: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.wal_compression: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.atomic_flush: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                 Options.log_readahead_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.allow_data_in_errors: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.db_host_id: __hostname__
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.max_background_jobs: 4
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.max_background_compactions: -1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.max_subcompactions: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.max_open_files: -1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.bytes_per_sync: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.max_background_flushes: -1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Compression algorithms supported:
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         kZSTD supported: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         kXpressCompression supported: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         kBZip2Compression supported: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         kLZ4Compression supported: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         kZlibCompression supported: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         kLZ4HCCompression supported: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         kSnappyCompression supported: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f1915c2c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x555f191491f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f1915c2c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x555f191491f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec 01 09:14:15 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) v1
Dec 01 09:14:15 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 01 09:14:15 compute-0 podman[89215]: 2025-12-01 09:14:15.091161403 +0000 UTC m=+0.586717285 container remove 0c9713b52fcf1503ee3248574e2cf69bc30b1f8cd72a81e03c95b3d2b07e4980 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_heisenberg, CEPH_REF=reef, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507)
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f1915c2c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x555f191491f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f1915c2c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x555f191491f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:15 compute-0 systemd[1]: libpod-conmon-0c9713b52fcf1503ee3248574e2cf69bc30b1f8cd72a81e03c95b3d2b07e4980.scope: Deactivated successfully.
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f1915c2c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x555f191491f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f1915c2c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x555f191491f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f1915c2c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x555f191491f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f1915c240)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x555f19149090
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f1915c240)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x555f19149090
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:15 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f1915c240)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x555f19149090
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 5fcb8a6d-e992-4698-8305-75c619417288
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580455130933, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580455131142, "job": 1, "event": "recovery_finished"}
Dec 01 09:14:15 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta old nid_max 1025
Dec 01 09:14:15 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta old blobid_max 10240
Dec 01 09:14:15 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Dec 01 09:14:15 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta min_alloc_size 0x1000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: freelist init
Dec 01 09:14:15 compute-0 ceph-osd[89052]: freelist _read_cfg
Dec 01 09:14:15 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _init_alloc loaded 20 GiB in 2 extents, allocator type hybrid, capacity 0x4ffc00000, block size 0x1000, free 0x4ffbfd000, fragmentation 1.9e-07
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Dec 01 09:14:15 compute-0 ceph-osd[89052]: bluefs umount
Dec 01 09:14:15 compute-0 ceph-osd[89052]: bdev(0x555f19f8f400 /var/lib/ceph/osd/ceph-1/block) close
Dec 01 09:14:15 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec 01 09:14:15 compute-0 ceph-osd[89052]: bdev(0x555f19f8f400 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec 01 09:14:15 compute-0 ceph-osd[89052]: bdev(0x555f19f8f400 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec 01 09:14:15 compute-0 ceph-osd[89052]: bdev(0x555f19f8f400 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:14:15 compute-0 ceph-osd[89052]: bdev(0x555f19f8f400 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:14:15 compute-0 ceph-osd[89052]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 20 GiB
Dec 01 09:14:15 compute-0 ceph-osd[89052]: bluefs mount
Dec 01 09:14:15 compute-0 ceph-osd[89052]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: bluefs mount shared_bdev_used = 4718592
Dec 01 09:14:15 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: RocksDB version: 7.9.2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Git sha 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Compile date 2025-05-06 23:30:25
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: DB SUMMARY
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: DB Session ID:  NR1ZS73OTAFE67X1H6FM
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: CURRENT file:  CURRENT
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: IDENTITY file:  IDENTITY
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                         Options.error_if_exists: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                       Options.create_if_missing: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                         Options.paranoid_checks: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                                     Options.env: 0x555f1a110380
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                                Options.info_log: 0x555f19153280
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.max_file_opening_threads: 16
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                              Options.statistics: (nil)
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                               Options.use_fsync: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                       Options.max_log_file_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                         Options.allow_fallocate: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.use_direct_reads: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.create_missing_column_families: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                              Options.db_log_dir: 
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                                 Options.wal_dir: db.wal
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.advise_random_on_open: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.write_buffer_manager: 0x555f1a0686e0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                            Options.rate_limiter: (nil)
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.unordered_write: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                               Options.row_cache: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                              Options.wal_filter: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.allow_ingest_behind: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.two_write_queues: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.manual_wal_flush: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.wal_compression: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.atomic_flush: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                 Options.log_readahead_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.allow_data_in_errors: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.db_host_id: __hostname__
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.max_background_jobs: 4
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.max_background_compactions: -1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.max_subcompactions: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.max_open_files: -1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.bytes_per_sync: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.max_background_flushes: -1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Compression algorithms supported:
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         kZSTD supported: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         kXpressCompression supported: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         kBZip2Compression supported: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         kLZ4Compression supported: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         kZlibCompression supported: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         kLZ4HCCompression supported: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         kSnappyCompression supported: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f1912fc80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x555f191491f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f1912fc80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x555f191491f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f1912fc80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x555f191491f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f1912fc80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x555f191491f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f1912fc80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x555f191491f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f1912fc80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x555f191491f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f1912fc80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x555f191491f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f19153840)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x555f19149090
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f19153840)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x555f19149090
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f19153840)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x555f19149090
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 5fcb8a6d-e992-4698-8305-75c619417288
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580455341591, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580455395271, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580455, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "5fcb8a6d-e992-4698-8305-75c619417288", "db_session_id": "NR1ZS73OTAFE67X1H6FM", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580455398873, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1594, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580455, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "5fcb8a6d-e992-4698-8305-75c619417288", "db_session_id": "NR1ZS73OTAFE67X1H6FM", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580455401271, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580455, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "5fcb8a6d-e992-4698-8305-75c619417288", "db_session_id": "NR1ZS73OTAFE67X1H6FM", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580455402883, "job": 1, "event": "recovery_finished"}
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Dec 01 09:14:15 compute-0 podman[89644]: 2025-12-01 09:14:15.440218553 +0000 UTC m=+0.070782859 container create ccf5f99d433310479d35aa3480666cdc8a5bbbf7b15712067271d9759035a22e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate-test, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x555f192b7c00
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: DB pointer 0x555f1a051a00
Dec 01 09:14:15 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec 01 09:14:15 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _upgrade_super from 4, latest 4
Dec 01 09:14:15 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _upgrade_super done
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:14:15 compute-0 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 0.2 total, 0.2 interval
                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.2 total, 0.2 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds
                                           Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.2 total, 0.2 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.2 total, 0.2 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.2 total, 0.2 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.2 total, 0.2 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.2 total, 0.2 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.2 total, 0.2 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.2 total, 0.2 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f19149090#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.2 total, 0.2 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f19149090#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.2 total, 0.2 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f19149090#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.2 total, 0.2 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.2 total, 0.2 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 01 09:14:15 compute-0 ceph-osd[89052]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Dec 01 09:14:15 compute-0 ceph-osd[89052]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/hello/cls_hello.cc:316: loading cls_hello
Dec 01 09:14:15 compute-0 ceph-osd[89052]: _get_class not permitted to load lua
Dec 01 09:14:15 compute-0 ceph-osd[89052]: _get_class not permitted to load sdk
Dec 01 09:14:15 compute-0 ceph-osd[89052]: _get_class not permitted to load test_remote_reads
Dec 01 09:14:15 compute-0 ceph-osd[89052]: osd.1 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Dec 01 09:14:15 compute-0 ceph-osd[89052]: osd.1 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Dec 01 09:14:15 compute-0 ceph-osd[89052]: osd.1 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Dec 01 09:14:15 compute-0 ceph-osd[89052]: osd.1 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Dec 01 09:14:15 compute-0 ceph-osd[89052]: osd.1 0 load_pgs
Dec 01 09:14:15 compute-0 ceph-osd[89052]: osd.1 0 load_pgs opened 0 pgs
Dec 01 09:14:15 compute-0 ceph-osd[89052]: osd.1 0 log_to_monitors true
Dec 01 09:14:15 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1[89048]: 2025-12-01T09:14:15.480+0000 7f11a5e1b740 -1 osd.1 0 log_to_monitors true
Dec 01 09:14:15 compute-0 podman[89644]: 2025-12-01 09:14:15.394084977 +0000 UTC m=+0.024649283 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:14:15 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]} v 0) v1
Dec 01 09:14:15 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='osd.1 [v2:192.168.122.100:6806/1623497857,v1:192.168.122.100:6807/1623497857]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch
Dec 01 09:14:15 compute-0 systemd[1]: Started libpod-conmon-ccf5f99d433310479d35aa3480666cdc8a5bbbf7b15712067271d9759035a22e.scope.
Dec 01 09:14:15 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d68077d642ca5a1114cd317028679bfdfa8a23b8e06cc700eadf92f64ec9e8dd/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d68077d642ca5a1114cd317028679bfdfa8a23b8e06cc700eadf92f64ec9e8dd/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d68077d642ca5a1114cd317028679bfdfa8a23b8e06cc700eadf92f64ec9e8dd/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d68077d642ca5a1114cd317028679bfdfa8a23b8e06cc700eadf92f64ec9e8dd/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d68077d642ca5a1114cd317028679bfdfa8a23b8e06cc700eadf92f64ec9e8dd/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:15 compute-0 podman[89644]: 2025-12-01 09:14:15.573428783 +0000 UTC m=+0.203993109 container init ccf5f99d433310479d35aa3480666cdc8a5bbbf7b15712067271d9759035a22e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate-test, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec 01 09:14:15 compute-0 podman[89644]: 2025-12-01 09:14:15.582446928 +0000 UTC m=+0.213011234 container start ccf5f99d433310479d35aa3480666cdc8a5bbbf7b15712067271d9759035a22e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate-test, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507)
Dec 01 09:14:15 compute-0 podman[89644]: 2025-12-01 09:14:15.598951241 +0000 UTC m=+0.229515547 container attach ccf5f99d433310479d35aa3480666cdc8a5bbbf7b15712067271d9759035a22e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate-test, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec 01 09:14:16 compute-0 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.0 v2:192.168.122.100:6802/4016881853; not ready for session (expect reconnect)
Dec 01 09:14:16 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) v1
Dec 01 09:14:16 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 01 09:14:16 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec 01 09:14:16 compute-0 sudo[89722]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itkvwartuqrpfhsoimldsvuvuqmymzqk ; /usr/bin/python3'
Dec 01 09:14:16 compute-0 sudo[89722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:14:16 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e8 do_prune osdmap full prune enabled
Dec 01 09:14:16 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e8 encode_pending skipping prime_pg_temp; mapping job did not start
Dec 01 09:14:16 compute-0 ceph-mon[75031]: pgmap v30: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:14:16 compute-0 ceph-mon[75031]: from='osd.1 [v2:192.168.122.100:6806/1623497857,v1:192.168.122.100:6807/1623497857]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch
Dec 01 09:14:16 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 01 09:14:16 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='osd.1 [v2:192.168.122.100:6806/1623497857,v1:192.168.122.100:6807/1623497857]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished
Dec 01 09:14:16 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e9 e9: 3 total, 0 up, 3 in
Dec 01 09:14:16 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e9: 3 total, 0 up, 3 in
Dec 01 09:14:16 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) v1
Dec 01 09:14:16 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 01 09:14:16 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec 01 09:14:16 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:16 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec 01 09:14:16 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:16 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec 01 09:14:16 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec 01 09:14:16 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec 01 09:14:16 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=compute-0", "root=default"]} v 0) v1
Dec 01 09:14:16 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='osd.1 [v2:192.168.122.100:6806/1623497857,v1:192.168.122.100:6807/1623497857]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]: dispatch
Dec 01 09:14:16 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e9 create-or-move crush item name 'osd.1' initial_weight 0.0195 at location {host=compute-0,root=default}
Dec 01 09:14:16 compute-0 python3[89724]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z   --volume /home/ceph-admin/specs/ceph_spec.yaml:/home/ceph_spec.yaml:z   --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   status --format json | jq .osdmap.num_up_osds _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:14:16 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Dec 01 09:14:16 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Dec 01 09:14:16 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate-test[89694]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Dec 01 09:14:16 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate-test[89694]:                             [--no-systemd] [--no-tmpfs]
Dec 01 09:14:16 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate-test[89694]: ceph-volume activate: error: unrecognized arguments: --bad-option
Dec 01 09:14:16 compute-0 systemd[1]: libpod-ccf5f99d433310479d35aa3480666cdc8a5bbbf7b15712067271d9759035a22e.scope: Deactivated successfully.
Dec 01 09:14:16 compute-0 systemd[1]: libpod-ccf5f99d433310479d35aa3480666cdc8a5bbbf7b15712067271d9759035a22e.scope: Consumed 1.083s CPU time.
Dec 01 09:14:16 compute-0 podman[89726]: 2025-12-01 09:14:16.834551315 +0000 UTC m=+0.246097233 container create 5d8d77e5a01370fe464c11f4c5b8933ccd01f4a22286bd3bafa1ee6ee773a695 (image=quay.io/ceph/ceph:v18, name=amazing_leavitt, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0)
Dec 01 09:14:16 compute-0 podman[89726]: 2025-12-01 09:14:16.740269451 +0000 UTC m=+0.151815389 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:14:16 compute-0 podman[89644]: 2025-12-01 09:14:16.873248295 +0000 UTC m=+1.503812601 container died ccf5f99d433310479d35aa3480666cdc8a5bbbf7b15712067271d9759035a22e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate-test, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec 01 09:14:16 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v32: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:14:17 compute-0 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.0 v2:192.168.122.100:6802/4016881853; not ready for session (expect reconnect)
Dec 01 09:14:17 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) v1
Dec 01 09:14:17 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 01 09:14:17 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec 01 09:14:17 compute-0 systemd[1]: Started libpod-conmon-5d8d77e5a01370fe464c11f4c5b8933ccd01f4a22286bd3bafa1ee6ee773a695.scope.
Dec 01 09:14:17 compute-0 systemd[1]: var-lib-containers-storage-overlay-d68077d642ca5a1114cd317028679bfdfa8a23b8e06cc700eadf92f64ec9e8dd-merged.mount: Deactivated successfully.
Dec 01 09:14:17 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e53e3256469721831714918dd626b9cf4dc03de3463be90f34cbdd78ed981626/merged/home/ceph_spec.yaml supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e53e3256469721831714918dd626b9cf4dc03de3463be90f34cbdd78ed981626/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e53e3256469721831714918dd626b9cf4dc03de3463be90f34cbdd78ed981626/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:17 compute-0 podman[89644]: 2025-12-01 09:14:17.094076306 +0000 UTC m=+1.724640612 container remove ccf5f99d433310479d35aa3480666cdc8a5bbbf7b15712067271d9759035a22e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate-test, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Dec 01 09:14:17 compute-0 systemd[1]: libpod-conmon-ccf5f99d433310479d35aa3480666cdc8a5bbbf7b15712067271d9759035a22e.scope: Deactivated successfully.
Dec 01 09:14:17 compute-0 podman[89726]: 2025-12-01 09:14:17.117156179 +0000 UTC m=+0.528702127 container init 5d8d77e5a01370fe464c11f4c5b8933ccd01f4a22286bd3bafa1ee6ee773a695 (image=quay.io/ceph/ceph:v18, name=amazing_leavitt, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 01 09:14:17 compute-0 podman[89726]: 2025-12-01 09:14:17.154871149 +0000 UTC m=+0.566417057 container start 5d8d77e5a01370fe464c11f4c5b8933ccd01f4a22286bd3bafa1ee6ee773a695 (image=quay.io/ceph/ceph:v18, name=amazing_leavitt, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Dec 01 09:14:17 compute-0 podman[89726]: 2025-12-01 09:14:17.16705348 +0000 UTC m=+0.578599428 container attach 5d8d77e5a01370fe464c11f4c5b8933ccd01f4a22286bd3bafa1ee6ee773a695 (image=quay.io/ceph/ceph:v18, name=amazing_leavitt, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507)
Dec 01 09:14:17 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e9 do_prune osdmap full prune enabled
Dec 01 09:14:17 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e9 encode_pending skipping prime_pg_temp; mapping job did not start
Dec 01 09:14:17 compute-0 ceph-mon[75031]: from='osd.1 [v2:192.168.122.100:6806/1623497857,v1:192.168.122.100:6807/1623497857]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished
Dec 01 09:14:17 compute-0 ceph-mon[75031]: osdmap e9: 3 total, 0 up, 3 in
Dec 01 09:14:17 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 01 09:14:17 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:17 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:17 compute-0 ceph-mon[75031]: from='osd.1 [v2:192.168.122.100:6806/1623497857,v1:192.168.122.100:6807/1623497857]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]: dispatch
Dec 01 09:14:17 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 01 09:14:17 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='osd.1 [v2:192.168.122.100:6806/1623497857,v1:192.168.122.100:6807/1623497857]' entity='osd.1' cmd='[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]': finished
Dec 01 09:14:17 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e10 e10: 3 total, 0 up, 3 in
Dec 01 09:14:17 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e10: 3 total, 0 up, 3 in
Dec 01 09:14:17 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) v1
Dec 01 09:14:17 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 01 09:14:17 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec 01 09:14:17 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:17 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec 01 09:14:17 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:17 compute-0 ceph-osd[89052]: osd.1 0 done with init, starting boot process
Dec 01 09:14:17 compute-0 ceph-osd[89052]: osd.1 0 start_boot
Dec 01 09:14:17 compute-0 ceph-osd[89052]: osd.1 0 maybe_override_options_for_qos osd_max_backfills set to 1
Dec 01 09:14:17 compute-0 ceph-osd[89052]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Dec 01 09:14:17 compute-0 ceph-osd[89052]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Dec 01 09:14:17 compute-0 ceph-osd[89052]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Dec 01 09:14:17 compute-0 ceph-osd[89052]: osd.1 0  bench count 12288000 bsize 4 KiB
Dec 01 09:14:17 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec 01 09:14:17 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec 01 09:14:17 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec 01 09:14:17 compute-0 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.1 v2:192.168.122.100:6806/1623497857; not ready for session (expect reconnect)
Dec 01 09:14:17 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec 01 09:14:17 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:17 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec 01 09:14:17 compute-0 systemd[1]: Reloading.
Dec 01 09:14:17 compute-0 systemd-rc-local-generator[89802]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:14:17 compute-0 systemd-sysv-generator[89808]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:14:17 compute-0 systemd[1]: Reloading.
Dec 01 09:14:17 compute-0 systemd-rc-local-generator[89858]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:14:17 compute-0 systemd-sysv-generator[89861]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:14:17 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e10 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:14:18 compute-0 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.0 v2:192.168.122.100:6802/4016881853; not ready for session (expect reconnect)
Dec 01 09:14:18 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) v1
Dec 01 09:14:18 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 01 09:14:18 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec 01 09:14:18 compute-0 systemd[1]: Starting Ceph osd.2 for 5620a9fb-e540-5250-a0e8-7aaad5347e3b...
Dec 01 09:14:18 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json"} v 0) v1
Dec 01 09:14:18 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2893147319' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Dec 01 09:14:18 compute-0 amazing_leavitt[89757]: 
Dec 01 09:14:18 compute-0 amazing_leavitt[89757]: {"fsid":"5620a9fb-e540-5250-a0e8-7aaad5347e3b","health":{"status":"HEALTH_OK","checks":{},"mutes":[]},"election_epoch":5,"quorum":[0],"quorum_names":["compute-0"],"quorum_age":115,"monmap":{"epoch":1,"min_mon_release_name":"reef","num_mons":1},"osdmap":{"epoch":10,"num_osds":3,"num_up_osds":0,"osd_up_since":0,"num_in_osds":3,"osd_in_since":1764580437,"num_remapped_pgs":0},"pgmap":{"pgs_by_state":[],"num_pgs":0,"num_pools":0,"num_objects":0,"data_bytes":0,"bytes_used":0,"bytes_avail":0,"bytes_total":0},"fsmap":{"epoch":1,"by_rank":[],"up:standby":0},"mgrmap":{"available":true,"num_standbys":0,"modules":["cephadm","iostat","nfs","restful"],"services":{}},"servicemap":{"epoch":2,"modified":"2025-12-01T09:14:14.959091+0000","services":{}},"progress_events":{}}
Dec 01 09:14:18 compute-0 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.1 v2:192.168.122.100:6806/1623497857; not ready for session (expect reconnect)
Dec 01 09:14:18 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec 01 09:14:18 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:18 compute-0 ceph-mon[75031]: pgmap v32: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:14:18 compute-0 ceph-mon[75031]: from='osd.1 [v2:192.168.122.100:6806/1623497857,v1:192.168.122.100:6807/1623497857]' entity='osd.1' cmd='[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]': finished
Dec 01 09:14:18 compute-0 ceph-mon[75031]: osdmap e10: 3 total, 0 up, 3 in
Dec 01 09:14:18 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 01 09:14:18 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:18 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:18 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:18 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 01 09:14:18 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2893147319' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Dec 01 09:14:18 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec 01 09:14:18 compute-0 systemd[1]: libpod-5d8d77e5a01370fe464c11f4c5b8933ccd01f4a22286bd3bafa1ee6ee773a695.scope: Deactivated successfully.
Dec 01 09:14:18 compute-0 systemd[1]: libpod-5d8d77e5a01370fe464c11f4c5b8933ccd01f4a22286bd3bafa1ee6ee773a695.scope: Consumed 1.141s CPU time.
Dec 01 09:14:18 compute-0 podman[89726]: 2025-12-01 09:14:18.314320941 +0000 UTC m=+1.725866889 container died 5d8d77e5a01370fe464c11f4c5b8933ccd01f4a22286bd3bafa1ee6ee773a695 (image=quay.io/ceph/ceph:v18, name=amazing_leavitt, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Dec 01 09:14:18 compute-0 systemd[1]: var-lib-containers-storage-overlay-e53e3256469721831714918dd626b9cf4dc03de3463be90f34cbdd78ed981626-merged.mount: Deactivated successfully.
Dec 01 09:14:18 compute-0 podman[89726]: 2025-12-01 09:14:18.876181887 +0000 UTC m=+2.287727805 container remove 5d8d77e5a01370fe464c11f4c5b8933ccd01f4a22286bd3bafa1ee6ee773a695 (image=quay.io/ceph/ceph:v18, name=amazing_leavitt, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 01 09:14:18 compute-0 systemd[1]: libpod-conmon-5d8d77e5a01370fe464c11f4c5b8933ccd01f4a22286bd3bafa1ee6ee773a695.scope: Deactivated successfully.
Dec 01 09:14:18 compute-0 sudo[89722]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:18 compute-0 podman[89934]: 2025-12-01 09:14:18.930584016 +0000 UTC m=+0.507652036 container create 0a1791e946a9ed36364ab7a161adfca0762642f0ca98bc892e1b25a475673bb4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:14:18 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v34: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:14:18 compute-0 podman[89934]: 2025-12-01 09:14:18.898149917 +0000 UTC m=+0.475217957 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:14:18 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a7370faf47ecc46d2f5cdc94c3e09cfe3b4487f61929d197202eb934212d00a/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a7370faf47ecc46d2f5cdc94c3e09cfe3b4487f61929d197202eb934212d00a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a7370faf47ecc46d2f5cdc94c3e09cfe3b4487f61929d197202eb934212d00a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a7370faf47ecc46d2f5cdc94c3e09cfe3b4487f61929d197202eb934212d00a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a7370faf47ecc46d2f5cdc94c3e09cfe3b4487f61929d197202eb934212d00a/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:19 compute-0 podman[89934]: 2025-12-01 09:14:19.031804321 +0000 UTC m=+0.608872371 container init 0a1791e946a9ed36364ab7a161adfca0762642f0ca98bc892e1b25a475673bb4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:14:19 compute-0 podman[89934]: 2025-12-01 09:14:19.037947128 +0000 UTC m=+0.615015148 container start 0a1791e946a9ed36364ab7a161adfca0762642f0ca98bc892e1b25a475673bb4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:14:19 compute-0 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.0 v2:192.168.122.100:6802/4016881853; not ready for session (expect reconnect)
Dec 01 09:14:19 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) v1
Dec 01 09:14:19 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 01 09:14:19 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec 01 09:14:19 compute-0 podman[89934]: 2025-12-01 09:14:19.061739544 +0000 UTC m=+0.638807564 container attach 0a1791e946a9ed36364ab7a161adfca0762642f0ca98bc892e1b25a475673bb4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True)
Dec 01 09:14:19 compute-0 ceph-osd[88047]: osd.0 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 6.477 iops: 1658.093 elapsed_sec: 1.809
Dec 01 09:14:19 compute-0 ceph-osd[88047]: log_channel(cluster) log [WRN] : OSD bench result of 1658.093265 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.0. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Dec 01 09:14:19 compute-0 ceph-osd[88047]: osd.0 0 waiting for initial osdmap
Dec 01 09:14:19 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0[88043]: 2025-12-01T09:14:19.119+0000 7f1078ad1640 -1 osd.0 0 waiting for initial osdmap
Dec 01 09:14:19 compute-0 ceph-osd[88047]: osd.0 10 crush map has features 288514050185494528, adjusting msgr requires for clients
Dec 01 09:14:19 compute-0 ceph-osd[88047]: osd.0 10 crush map has features 288514050185494528 was 288232575208792577, adjusting msgr requires for mons
Dec 01 09:14:19 compute-0 ceph-osd[88047]: osd.0 10 crush map has features 3314932999778484224, adjusting msgr requires for osds
Dec 01 09:14:19 compute-0 ceph-osd[88047]: osd.0 10 check_osdmap_features require_osd_release unknown -> reef
Dec 01 09:14:19 compute-0 ceph-osd[88047]: osd.0 10 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec 01 09:14:19 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-0[88043]: 2025-12-01T09:14:19.209+0000 7f10740f9640 -1 osd.0 10 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec 01 09:14:19 compute-0 ceph-osd[88047]: osd.0 10 set_numa_affinity not setting numa affinity
Dec 01 09:14:19 compute-0 ceph-osd[88047]: osd.0 10 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial
Dec 01 09:14:19 compute-0 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.1 v2:192.168.122.100:6806/1623497857; not ready for session (expect reconnect)
Dec 01 09:14:19 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e10 do_prune osdmap full prune enabled
Dec 01 09:14:19 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e10 encode_pending skipping prime_pg_temp; mapping job did not start
Dec 01 09:14:19 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec 01 09:14:19 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:19 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e11 e11: 3 total, 1 up, 3 in
Dec 01 09:14:19 compute-0 ceph-mon[75031]: purged_snaps scrub starts
Dec 01 09:14:19 compute-0 ceph-mon[75031]: purged_snaps scrub ok
Dec 01 09:14:19 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:19 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 01 09:14:19 compute-0 ceph-mon[75031]: log_channel(cluster) log [INF] : osd.0 [v2:192.168.122.100:6802/4016881853,v1:192.168.122.100:6803/4016881853] boot
Dec 01 09:14:19 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e11: 3 total, 1 up, 3 in
Dec 01 09:14:19 compute-0 ceph-osd[88047]: osd.0 11 state: booting -> active
Dec 01 09:14:19 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec 01 09:14:19 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) v1
Dec 01 09:14:19 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 01 09:14:19 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec 01 09:14:19 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:19 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec 01 09:14:20 compute-0 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.1 v2:192.168.122.100:6806/1623497857; not ready for session (expect reconnect)
Dec 01 09:14:20 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec 01 09:14:20 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:20 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec 01 09:14:20 compute-0 ceph-mon[75031]: pgmap v34: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 01 09:14:20 compute-0 ceph-mon[75031]: OSD bench result of 1658.093265 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.0. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Dec 01 09:14:20 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:20 compute-0 ceph-mon[75031]: osd.0 [v2:192.168.122.100:6802/4016881853,v1:192.168.122.100:6803/4016881853] boot
Dec 01 09:14:20 compute-0 ceph-mon[75031]: osdmap e11: 3 total, 1 up, 3 in
Dec 01 09:14:20 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 01 09:14:20 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:20 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:20 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v36: 0 pgs: ; 0 B data, 426 MiB used, 20 GiB / 20 GiB avail
Dec 01 09:14:21 compute-0 ceph-mgr[75324]: [devicehealth INFO root] creating mgr pool
Dec 01 09:14:21 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true} v 0) v1
Dec 01 09:14:21 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]: dispatch
Dec 01 09:14:21 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate[89951]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec 01 09:14:21 compute-0 bash[89934]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec 01 09:14:21 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate[89951]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-2 --no-mon-config --dev /dev/mapper/ceph_vg2-ceph_lv2
Dec 01 09:14:21 compute-0 bash[89934]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-2 --no-mon-config --dev /dev/mapper/ceph_vg2-ceph_lv2
Dec 01 09:14:21 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate[89951]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg2-ceph_lv2
Dec 01 09:14:21 compute-0 bash[89934]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg2-ceph_lv2
Dec 01 09:14:21 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate[89951]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2
Dec 01 09:14:21 compute-0 bash[89934]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2
Dec 01 09:14:21 compute-0 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.1 v2:192.168.122.100:6806/1623497857; not ready for session (expect reconnect)
Dec 01 09:14:21 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec 01 09:14:21 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:21 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec 01 09:14:21 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate[89951]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg2-ceph_lv2 /var/lib/ceph/osd/ceph-2/block
Dec 01 09:14:21 compute-0 bash[89934]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg2-ceph_lv2 /var/lib/ceph/osd/ceph-2/block
Dec 01 09:14:21 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e11 do_prune osdmap full prune enabled
Dec 01 09:14:21 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e11 encode_pending skipping prime_pg_temp; mapping job did not start
Dec 01 09:14:21 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate[89951]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec 01 09:14:21 compute-0 bash[89934]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec 01 09:14:21 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate[89951]: --> ceph-volume raw activate successful for osd ID: 2
Dec 01 09:14:21 compute-0 bash[89934]: --> ceph-volume raw activate successful for osd ID: 2
Dec 01 09:14:21 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]: dispatch
Dec 01 09:14:21 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:21 compute-0 systemd[1]: libpod-0a1791e946a9ed36364ab7a161adfca0762642f0ca98bc892e1b25a475673bb4.scope: Deactivated successfully.
Dec 01 09:14:21 compute-0 podman[89934]: 2025-12-01 09:14:21.836238885 +0000 UTC m=+3.413306905 container died 0a1791e946a9ed36364ab7a161adfca0762642f0ca98bc892e1b25a475673bb4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:14:21 compute-0 systemd[1]: libpod-0a1791e946a9ed36364ab7a161adfca0762642f0ca98bc892e1b25a475673bb4.scope: Consumed 2.860s CPU time.
Dec 01 09:14:21 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]': finished
Dec 01 09:14:21 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e12 e12: 3 total, 1 up, 3 in
Dec 01 09:14:21 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e12 crush map has features 3314933000852226048, adjusting msgr requires
Dec 01 09:14:21 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e12 crush map has features 288514051259236352, adjusting msgr requires
Dec 01 09:14:21 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e12 crush map has features 288514051259236352, adjusting msgr requires
Dec 01 09:14:21 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e12 crush map has features 288514051259236352, adjusting msgr requires
Dec 01 09:14:21 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e12: 3 total, 1 up, 3 in
Dec 01 09:14:21 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec 01 09:14:21 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:21 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec 01 09:14:21 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:21 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec 01 09:14:21 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec 01 09:14:21 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true} v 0) v1
Dec 01 09:14:21 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]: dispatch
Dec 01 09:14:21 compute-0 ceph-osd[88047]: osd.0 12 crush map has features 288514051259236352, adjusting msgr requires for clients
Dec 01 09:14:21 compute-0 ceph-osd[88047]: osd.0 12 crush map has features 288514051259236352 was 288514050185503233, adjusting msgr requires for mons
Dec 01 09:14:21 compute-0 ceph-osd[88047]: osd.0 12 crush map has features 3314933000852226048, adjusting msgr requires for osds
Dec 01 09:14:21 compute-0 systemd[1]: var-lib-containers-storage-overlay-1a7370faf47ecc46d2f5cdc94c3e09cfe3b4487f61929d197202eb934212d00a-merged.mount: Deactivated successfully.
Dec 01 09:14:21 compute-0 podman[89934]: 2025-12-01 09:14:21.9771448 +0000 UTC m=+3.554212820 container remove 0a1791e946a9ed36364ab7a161adfca0762642f0ca98bc892e1b25a475673bb4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2-activate, OSD_FLAVOR=default, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:14:22 compute-0 podman[90146]: 2025-12-01 09:14:22.240154997 +0000 UTC m=+0.086434685 container create b8cc745a821772e0e0fff3e042e1907ba7c258782eed74884fe6924e85cc9329 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:14:22 compute-0 podman[90146]: 2025-12-01 09:14:22.187692118 +0000 UTC m=+0.033971836 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:14:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ecd993e200d938a106ab467d743f0b19b8be9b6c706b9ccce917fda3605d2911/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ecd993e200d938a106ab467d743f0b19b8be9b6c706b9ccce917fda3605d2911/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ecd993e200d938a106ab467d743f0b19b8be9b6c706b9ccce917fda3605d2911/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ecd993e200d938a106ab467d743f0b19b8be9b6c706b9ccce917fda3605d2911/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:22 compute-0 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.1 v2:192.168.122.100:6806/1623497857; not ready for session (expect reconnect)
Dec 01 09:14:22 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec 01 09:14:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ecd993e200d938a106ab467d743f0b19b8be9b6c706b9ccce917fda3605d2911/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:22 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:22 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec 01 09:14:22 compute-0 podman[90146]: 2025-12-01 09:14:22.372253414 +0000 UTC m=+0.218533102 container init b8cc745a821772e0e0fff3e042e1907ba7c258782eed74884fe6924e85cc9329 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec 01 09:14:22 compute-0 podman[90146]: 2025-12-01 09:14:22.383420065 +0000 UTC m=+0.229699753 container start b8cc745a821772e0e0fff3e042e1907ba7c258782eed74884fe6924e85cc9329 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, ceph=True)
Dec 01 09:14:22 compute-0 bash[90146]: b8cc745a821772e0e0fff3e042e1907ba7c258782eed74884fe6924e85cc9329
Dec 01 09:14:22 compute-0 systemd[1]: Started Ceph osd.2 for 5620a9fb-e540-5250-a0e8-7aaad5347e3b.
Dec 01 09:14:22 compute-0 ceph-osd[90166]: set uid:gid to 167:167 (ceph:ceph)
Dec 01 09:14:22 compute-0 ceph-osd[90166]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-osd, pid 2
Dec 01 09:14:22 compute-0 ceph-osd[90166]: pidfile_write: ignore empty --pid-file
Dec 01 09:14:22 compute-0 ceph-osd[90166]: bdev(0x5595d5919800 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 01 09:14:22 compute-0 ceph-osd[90166]: bdev(0x5595d5919800 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 01 09:14:22 compute-0 ceph-osd[90166]: bdev(0x5595d5919800 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:14:22 compute-0 ceph-osd[90166]: bdev(0x5595d5919800 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:14:22 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec 01 09:14:22 compute-0 ceph-osd[90166]: bdev(0x5595d6751800 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 01 09:14:22 compute-0 ceph-osd[90166]: bdev(0x5595d6751800 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 01 09:14:22 compute-0 ceph-osd[90166]: bdev(0x5595d6751800 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:14:22 compute-0 ceph-osd[90166]: bdev(0x5595d6751800 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:14:22 compute-0 ceph-osd[90166]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Dec 01 09:14:22 compute-0 ceph-osd[90166]: bdev(0x5595d6751800 /var/lib/ceph/osd/ceph-2/block) close
Dec 01 09:14:22 compute-0 ceph-osd[90166]: bdev(0x5595d5919800 /var/lib/ceph/osd/ceph-2/block) close
Dec 01 09:14:22 compute-0 sudo[89142]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:22 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:14:22 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:22 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:14:22 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:22 compute-0 sudo[90181]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:22 compute-0 sudo[90181]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:22 compute-0 sudo[90181]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:22 compute-0 sudo[90206]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:14:22 compute-0 sudo[90206]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:22 compute-0 sudo[90206]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:22 compute-0 ceph-osd[90166]: starting osd.2 osd_data /var/lib/ceph/osd/ceph-2 /var/lib/ceph/osd/ceph-2/journal
Dec 01 09:14:22 compute-0 ceph-osd[90166]: load: jerasure load: lrc 
Dec 01 09:14:22 compute-0 ceph-osd[90166]: bdev(0x5595d67d2c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 01 09:14:22 compute-0 ceph-osd[90166]: bdev(0x5595d67d2c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 01 09:14:22 compute-0 ceph-osd[90166]: bdev(0x5595d67d2c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:14:22 compute-0 ceph-osd[90166]: bdev(0x5595d67d2c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:14:22 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec 01 09:14:22 compute-0 ceph-osd[90166]: bdev(0x5595d67d2c00 /var/lib/ceph/osd/ceph-2/block) close
Dec 01 09:14:22 compute-0 sudo[90231]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:22 compute-0 sudo[90231]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:22 compute-0 sudo[90231]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:22 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e12 do_prune osdmap full prune enabled
Dec 01 09:14:22 compute-0 sudo[90261]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- raw list --format json
Dec 01 09:14:22 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]': finished
Dec 01 09:14:22 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e13 e13: 3 total, 1 up, 3 in
Dec 01 09:14:22 compute-0 sudo[90261]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:22 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e13: 3 total, 1 up, 3 in
Dec 01 09:14:22 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec 01 09:14:22 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:22 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec 01 09:14:22 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:22 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec 01 09:14:22 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec 01 09:14:22 compute-0 ceph-mon[75031]: pgmap v36: 0 pgs: ; 0 B data, 426 MiB used, 20 GiB / 20 GiB avail
Dec 01 09:14:22 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]': finished
Dec 01 09:14:22 compute-0 ceph-mon[75031]: osdmap e12: 3 total, 1 up, 3 in
Dec 01 09:14:22 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:22 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:22 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]: dispatch
Dec 01 09:14:22 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:22 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:22 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:22 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e13 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:14:22 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v39: 1 pgs: 1 unknown; 0 B data, 426 MiB used, 20 GiB / 20 GiB avail
Dec 01 09:14:22 compute-0 ceph-osd[90166]: bdev(0x5595d67d2c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 01 09:14:22 compute-0 ceph-osd[90166]: bdev(0x5595d67d2c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 01 09:14:22 compute-0 ceph-osd[90166]: bdev(0x5595d67d2c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:14:22 compute-0 ceph-osd[90166]: bdev(0x5595d67d2c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:14:22 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec 01 09:14:22 compute-0 ceph-osd[90166]: bdev(0x5595d67d2c00 /var/lib/ceph/osd/ceph-2/block) close
Dec 01 09:14:23 compute-0 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.1 v2:192.168.122.100:6806/1623497857; not ready for session (expect reconnect)
Dec 01 09:14:23 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec 01 09:14:23 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:23 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec 01 09:14:23 compute-0 ceph-osd[90166]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Dec 01 09:14:23 compute-0 ceph-osd[90166]: osd.2:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Dec 01 09:14:23 compute-0 ceph-osd[90166]: bdev(0x5595d67d2c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 01 09:14:23 compute-0 ceph-osd[90166]: bdev(0x5595d67d2c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 01 09:14:23 compute-0 ceph-osd[90166]: bdev(0x5595d67d2c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:14:23 compute-0 ceph-osd[90166]: bdev(0x5595d67d2c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:14:23 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec 01 09:14:23 compute-0 ceph-osd[90166]: bdev(0x5595d67d3400 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 01 09:14:23 compute-0 ceph-osd[90166]: bdev(0x5595d67d3400 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 01 09:14:23 compute-0 ceph-osd[90166]: bdev(0x5595d67d3400 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:14:23 compute-0 ceph-osd[90166]: bdev(0x5595d67d3400 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:14:23 compute-0 ceph-osd[90166]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Dec 01 09:14:23 compute-0 ceph-osd[90166]: bluefs mount
Dec 01 09:14:23 compute-0 ceph-osd[90166]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: bluefs mount shared_bdev_used = 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: RocksDB version: 7.9.2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Git sha 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Compile date 2025-05-06 23:30:25
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: DB SUMMARY
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: DB Session ID:  L5YAHCCF2C03Q9KM4AHV
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: CURRENT file:  CURRENT
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: IDENTITY file:  IDENTITY
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                         Options.error_if_exists: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                       Options.create_if_missing: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                         Options.paranoid_checks: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                                     Options.env: 0x5595d67a3c70
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                                Options.info_log: 0x5595d59a08a0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.max_file_opening_threads: 16
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                              Options.statistics: (nil)
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                               Options.use_fsync: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                       Options.max_log_file_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                         Options.allow_fallocate: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.use_direct_reads: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.create_missing_column_families: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                              Options.db_log_dir: 
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                                 Options.wal_dir: db.wal
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.advise_random_on_open: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.write_buffer_manager: 0x5595d68ac460
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                            Options.rate_limiter: (nil)
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.unordered_write: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                               Options.row_cache: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                              Options.wal_filter: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.allow_ingest_behind: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.two_write_queues: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.manual_wal_flush: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.wal_compression: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.atomic_flush: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                 Options.log_readahead_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.allow_data_in_errors: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.db_host_id: __hostname__
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.max_background_jobs: 4
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.max_background_compactions: -1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.max_subcompactions: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.max_open_files: -1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.bytes_per_sync: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.max_background_flushes: -1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Compression algorithms supported:
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         kZSTD supported: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         kXpressCompression supported: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         kBZip2Compression supported: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         kLZ4Compression supported: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         kZlibCompression supported: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         kLZ4HCCompression supported: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         kSnappyCompression supported: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d59a02c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5595d598d1f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d59a02c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5595d598d1f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d59a02c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5595d598d1f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d59a02c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5595d598d1f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d59a02c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5595d598d1f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:23 compute-0 podman[90330]: 2025-12-01 09:14:23.466277002 +0000 UTC m=+0.104371462 container create 2c15dd4672432835a4dce833152eb17d2cba360a0ecccb29a71cffe76126e9c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_hermann, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3)
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d59a02c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5595d598d1f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d59a02c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5595d598d1f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d59a0240)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5595d598d090
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d59a0240)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5595d598d090
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d59a0240)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5595d598d090
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 7e93ea91-2f6f-4eb5-a54b-eec339cd63ea
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580463475148, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580463475381, "job": 1, "event": "recovery_finished"}
Dec 01 09:14:23 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old nid_max 1025
Dec 01 09:14:23 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old blobid_max 10240
Dec 01 09:14:23 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Dec 01 09:14:23 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta min_alloc_size 0x1000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: freelist init
Dec 01 09:14:23 compute-0 ceph-osd[90166]: freelist _read_cfg
Dec 01 09:14:23 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _init_alloc loaded 20 GiB in 2 extents, allocator type hybrid, capacity 0x4ffc00000, block size 0x1000, free 0x4ffbfd000, fragmentation 1.9e-07
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Dec 01 09:14:23 compute-0 ceph-osd[90166]: bluefs umount
Dec 01 09:14:23 compute-0 ceph-osd[90166]: bdev(0x5595d67d3400 /var/lib/ceph/osd/ceph-2/block) close
Dec 01 09:14:23 compute-0 podman[90330]: 2025-12-01 09:14:23.418476125 +0000 UTC m=+0.056570595 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:14:23 compute-0 systemd[1]: Started libpod-conmon-2c15dd4672432835a4dce833152eb17d2cba360a0ecccb29a71cffe76126e9c3.scope.
Dec 01 09:14:23 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:23 compute-0 podman[90330]: 2025-12-01 09:14:23.665188066 +0000 UTC m=+0.303282536 container init 2c15dd4672432835a4dce833152eb17d2cba360a0ecccb29a71cffe76126e9c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_hermann, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507)
Dec 01 09:14:23 compute-0 podman[90330]: 2025-12-01 09:14:23.682811893 +0000 UTC m=+0.320906343 container start 2c15dd4672432835a4dce833152eb17d2cba360a0ecccb29a71cffe76126e9c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_hermann, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507)
Dec 01 09:14:23 compute-0 flamboyant_hermann[90541]: 167 167
Dec 01 09:14:23 compute-0 systemd[1]: libpod-2c15dd4672432835a4dce833152eb17d2cba360a0ecccb29a71cffe76126e9c3.scope: Deactivated successfully.
Dec 01 09:14:23 compute-0 podman[90330]: 2025-12-01 09:14:23.711220639 +0000 UTC m=+0.349315079 container attach 2c15dd4672432835a4dce833152eb17d2cba360a0ecccb29a71cffe76126e9c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_hermann, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:14:23 compute-0 podman[90330]: 2025-12-01 09:14:23.712121236 +0000 UTC m=+0.350215686 container died 2c15dd4672432835a4dce833152eb17d2cba360a0ecccb29a71cffe76126e9c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_hermann, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, ceph=True)
Dec 01 09:14:23 compute-0 systemd[1]: var-lib-containers-storage-overlay-b0b139207923be2aa8c01b05508dc094c69253c7ceb0cf145b4525d40ef8c6d5-merged.mount: Deactivated successfully.
Dec 01 09:14:23 compute-0 ceph-osd[90166]: bdev(0x5595d67d3400 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 01 09:14:23 compute-0 ceph-osd[90166]: bdev(0x5595d67d3400 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 01 09:14:23 compute-0 ceph-osd[90166]: bdev(0x5595d67d3400 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:14:23 compute-0 ceph-osd[90166]: bdev(0x5595d67d3400 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:14:23 compute-0 ceph-osd[90166]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Dec 01 09:14:23 compute-0 ceph-osd[90166]: bluefs mount
Dec 01 09:14:23 compute-0 ceph-osd[90166]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: bluefs mount shared_bdev_used = 4718592
Dec 01 09:14:23 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: RocksDB version: 7.9.2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Git sha 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Compile date 2025-05-06 23:30:25
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: DB SUMMARY
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: DB Session ID:  L5YAHCCF2C03Q9KM4AHU
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: CURRENT file:  CURRENT
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: IDENTITY file:  IDENTITY
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                         Options.error_if_exists: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                       Options.create_if_missing: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                         Options.paranoid_checks: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                                     Options.env: 0x5595d6954380
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                                Options.info_log: 0x5595d5997280
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.max_file_opening_threads: 16
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                              Options.statistics: (nil)
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                               Options.use_fsync: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                       Options.max_log_file_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                         Options.allow_fallocate: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.use_direct_reads: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.create_missing_column_families: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                              Options.db_log_dir: 
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                                 Options.wal_dir: db.wal
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.advise_random_on_open: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.write_buffer_manager: 0x5595d68ac6e0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                            Options.rate_limiter: (nil)
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.unordered_write: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                               Options.row_cache: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                              Options.wal_filter: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.allow_ingest_behind: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.two_write_queues: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.manual_wal_flush: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.wal_compression: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.atomic_flush: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                 Options.log_readahead_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.allow_data_in_errors: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.db_host_id: __hostname__
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.max_background_jobs: 4
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.max_background_compactions: -1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.max_subcompactions: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.max_open_files: -1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.bytes_per_sync: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.max_background_flushes: -1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Compression algorithms supported:
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         kZSTD supported: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         kXpressCompression supported: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         kBZip2Compression supported: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         kLZ4Compression supported: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         kZlibCompression supported: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         kLZ4HCCompression supported: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         kSnappyCompression supported: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d5973c80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5595d598d1f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d5973c80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5595d598d1f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d5973c80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5595d598d1f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d5973c80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5595d598d1f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d5973c80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5595d598d1f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d5973c80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5595d598d1f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d5973c80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5595d598d1f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d5997840)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5595d598d090
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d5997840)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5595d598d090
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:           Options.merge_operator: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5595d5997840)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5595d598d090
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.compression: LZ4
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.num_levels: 7
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 01 09:14:23 compute-0 podman[90330]: 2025-12-01 09:14:23.897366363 +0000 UTC m=+0.535460813 container remove 2c15dd4672432835a4dce833152eb17d2cba360a0ecccb29a71cffe76126e9c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_hermann, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:14:23 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]': finished
Dec 01 09:14:23 compute-0 ceph-mon[75031]: osdmap e13: 3 total, 1 up, 3 in
Dec 01 09:14:23 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:23 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:23 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 01 09:14:23 compute-0 systemd[1]: libpod-conmon-2c15dd4672432835a4dce833152eb17d2cba360a0ecccb29a71cffe76126e9c3.scope: Deactivated successfully.
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 7e93ea91-2f6f-4eb5-a54b-eec339cd63ea
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580463972890, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec 01 09:14:23 compute-0 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec 01 09:14:24 compute-0 ceph-osd[90166]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580464007944, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580463, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "7e93ea91-2f6f-4eb5-a54b-eec339cd63ea", "db_session_id": "L5YAHCCF2C03Q9KM4AHU", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:14:24 compute-0 ceph-osd[90166]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580464062807, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1594, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580464, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "7e93ea91-2f6f-4eb5-a54b-eec339cd63ea", "db_session_id": "L5YAHCCF2C03Q9KM4AHU", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:14:24 compute-0 ceph-osd[90166]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580464067308, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580464, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "7e93ea91-2f6f-4eb5-a54b-eec339cd63ea", "db_session_id": "L5YAHCCF2C03Q9KM4AHU", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:14:24 compute-0 ceph-osd[90166]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580464068949, "job": 1, "event": "recovery_finished"}
Dec 01 09:14:24 compute-0 ceph-osd[90166]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Dec 01 09:14:24 compute-0 podman[90747]: 2025-12-01 09:14:24.161484774 +0000 UTC m=+0.075096720 container create 3abf7fecd204be53f513dfa3e5c77f23f757d07dd2ecad5fae269de8a32b3cd6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_stonebraker, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:14:24 compute-0 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x5595d5afbc00
Dec 01 09:14:24 compute-0 ceph-osd[90166]: rocksdb: DB pointer 0x5595d6895a00
Dec 01 09:14:24 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec 01 09:14:24 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super from 4, latest 4
Dec 01 09:14:24 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super done
Dec 01 09:14:24 compute-0 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:14:24 compute-0 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 0.4 total, 0.4 interval
                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.4 total, 0.4 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.4 total, 0.4 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.4 total, 0.4 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.4 total, 0.4 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.055       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.055       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.055       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.05              0.00         1    0.055       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.4 total, 0.4 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.4 total, 0.4 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.4 total, 0.4 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.4 total, 0.4 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d090#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.4 total, 0.4 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d090#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.4 total, 0.4 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d090#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.4 total, 0.4 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.4 total, 0.4 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 01 09:14:24 compute-0 ceph-osd[90166]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Dec 01 09:14:24 compute-0 ceph-osd[90166]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/hello/cls_hello.cc:316: loading cls_hello
Dec 01 09:14:24 compute-0 ceph-osd[90166]: _get_class not permitted to load lua
Dec 01 09:14:24 compute-0 ceph-osd[90166]: _get_class not permitted to load sdk
Dec 01 09:14:24 compute-0 ceph-osd[90166]: _get_class not permitted to load test_remote_reads
Dec 01 09:14:24 compute-0 ceph-osd[90166]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Dec 01 09:14:24 compute-0 ceph-osd[90166]: osd.2 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Dec 01 09:14:24 compute-0 ceph-osd[90166]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Dec 01 09:14:24 compute-0 ceph-osd[90166]: osd.2 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Dec 01 09:14:24 compute-0 ceph-osd[90166]: osd.2 0 load_pgs
Dec 01 09:14:24 compute-0 ceph-osd[90166]: osd.2 0 load_pgs opened 0 pgs
Dec 01 09:14:24 compute-0 ceph-osd[90166]: osd.2 0 log_to_monitors true
Dec 01 09:14:24 compute-0 podman[90747]: 2025-12-01 09:14:24.116513673 +0000 UTC m=+0.030125659 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:14:24 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2[90162]: 2025-12-01T09:14:24.176+0000 7f122005e740 -1 osd.2 0 log_to_monitors true
Dec 01 09:14:24 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]} v 0) v1
Dec 01 09:14:24 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.100:6810/3298763466,v1:192.168.122.100:6811/3298763466]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Dec 01 09:14:24 compute-0 systemd[1]: Started libpod-conmon-3abf7fecd204be53f513dfa3e5c77f23f757d07dd2ecad5fae269de8a32b3cd6.scope.
Dec 01 09:14:24 compute-0 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.1 v2:192.168.122.100:6806/1623497857; not ready for session (expect reconnect)
Dec 01 09:14:24 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec 01 09:14:24 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:24 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec 01 09:14:24 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca3153f023110b042fab684674e4433ee376f42db6be1bffa82ccc6d35957b76/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca3153f023110b042fab684674e4433ee376f42db6be1bffa82ccc6d35957b76/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca3153f023110b042fab684674e4433ee376f42db6be1bffa82ccc6d35957b76/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca3153f023110b042fab684674e4433ee376f42db6be1bffa82ccc6d35957b76/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:24 compute-0 podman[90747]: 2025-12-01 09:14:24.383852612 +0000 UTC m=+0.297464608 container init 3abf7fecd204be53f513dfa3e5c77f23f757d07dd2ecad5fae269de8a32b3cd6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_stonebraker, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS)
Dec 01 09:14:24 compute-0 podman[90747]: 2025-12-01 09:14:24.395789826 +0000 UTC m=+0.309401812 container start 3abf7fecd204be53f513dfa3e5c77f23f757d07dd2ecad5fae269de8a32b3cd6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_stonebraker, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, OSD_FLAVOR=default)
Dec 01 09:14:24 compute-0 podman[90747]: 2025-12-01 09:14:24.399725796 +0000 UTC m=+0.313337782 container attach 3abf7fecd204be53f513dfa3e5c77f23f757d07dd2ecad5fae269de8a32b3cd6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_stonebraker, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:14:24 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e13 do_prune osdmap full prune enabled
Dec 01 09:14:24 compute-0 ceph-mon[75031]: pgmap v39: 1 pgs: 1 unknown; 0 B data, 426 MiB used, 20 GiB / 20 GiB avail
Dec 01 09:14:24 compute-0 ceph-mon[75031]: from='osd.2 [v2:192.168.122.100:6810/3298763466,v1:192.168.122.100:6811/3298763466]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Dec 01 09:14:24 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:24 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.100:6810/3298763466,v1:192.168.122.100:6811/3298763466]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Dec 01 09:14:24 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e14 e14: 3 total, 1 up, 3 in
Dec 01 09:14:24 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e14: 3 total, 1 up, 3 in
Dec 01 09:14:24 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-0", "root=default"]} v 0) v1
Dec 01 09:14:24 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.100:6810/3298763466,v1:192.168.122.100:6811/3298763466]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]: dispatch
Dec 01 09:14:24 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e14 create-or-move crush item name 'osd.2' initial_weight 0.0195 at location {host=compute-0,root=default}
Dec 01 09:14:24 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v40: 1 pgs: 1 unknown; 0 B data, 426 MiB used, 20 GiB / 20 GiB avail
Dec 01 09:14:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec 01 09:14:25 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec 01 09:14:25 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:25 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec 01 09:14:25 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec 01 09:14:25 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Dec 01 09:14:25 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Dec 01 09:14:25 compute-0 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.1 v2:192.168.122.100:6806/1623497857; not ready for session (expect reconnect)
Dec 01 09:14:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec 01 09:14:25 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:25 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec 01 09:14:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e14 do_prune osdmap full prune enabled
Dec 01 09:14:25 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.100:6810/3298763466,v1:192.168.122.100:6811/3298763466]' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]': finished
Dec 01 09:14:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e15 e15: 3 total, 1 up, 3 in
Dec 01 09:14:25 compute-0 ceph-osd[90166]: osd.2 0 done with init, starting boot process
Dec 01 09:14:25 compute-0 ceph-osd[90166]: osd.2 0 start_boot
Dec 01 09:14:25 compute-0 ceph-osd[90166]: osd.2 0 maybe_override_options_for_qos osd_max_backfills set to 1
Dec 01 09:14:25 compute-0 ceph-osd[90166]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Dec 01 09:14:25 compute-0 ceph-osd[90166]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Dec 01 09:14:25 compute-0 ceph-osd[90166]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Dec 01 09:14:25 compute-0 ceph-osd[90166]: osd.2 0  bench count 12288000 bsize 4 KiB
Dec 01 09:14:25 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e15: 3 total, 1 up, 3 in
Dec 01 09:14:25 compute-0 great_stonebraker[90796]: {
Dec 01 09:14:25 compute-0 great_stonebraker[90796]:     "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec 01 09:14:25 compute-0 great_stonebraker[90796]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:14:25 compute-0 great_stonebraker[90796]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 01 09:14:25 compute-0 great_stonebraker[90796]:         "osd_id": 0,
Dec 01 09:14:25 compute-0 great_stonebraker[90796]:         "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:14:25 compute-0 great_stonebraker[90796]:         "type": "bluestore"
Dec 01 09:14:25 compute-0 great_stonebraker[90796]:     },
Dec 01 09:14:25 compute-0 great_stonebraker[90796]:     "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec 01 09:14:25 compute-0 great_stonebraker[90796]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:14:25 compute-0 great_stonebraker[90796]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 01 09:14:25 compute-0 great_stonebraker[90796]:         "osd_id": 1,
Dec 01 09:14:25 compute-0 great_stonebraker[90796]:         "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:14:25 compute-0 great_stonebraker[90796]:         "type": "bluestore"
Dec 01 09:14:25 compute-0 great_stonebraker[90796]:     },
Dec 01 09:14:25 compute-0 great_stonebraker[90796]:     "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec 01 09:14:25 compute-0 great_stonebraker[90796]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:14:25 compute-0 great_stonebraker[90796]:         "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec 01 09:14:25 compute-0 great_stonebraker[90796]:         "osd_id": 2,
Dec 01 09:14:25 compute-0 great_stonebraker[90796]:         "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:14:25 compute-0 great_stonebraker[90796]:         "type": "bluestore"
Dec 01 09:14:25 compute-0 great_stonebraker[90796]:     }
Dec 01 09:14:25 compute-0 great_stonebraker[90796]: }
Dec 01 09:14:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec 01 09:14:25 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec 01 09:14:25 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:25 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec 01 09:14:25 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec 01 09:14:26 compute-0 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.2 v2:192.168.122.100:6810/3298763466; not ready for session (expect reconnect)
Dec 01 09:14:26 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec 01 09:14:26 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:26 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec 01 09:14:26 compute-0 ceph-mon[75031]: from='osd.2 [v2:192.168.122.100:6810/3298763466,v1:192.168.122.100:6811/3298763466]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Dec 01 09:14:26 compute-0 ceph-mon[75031]: osdmap e14: 3 total, 1 up, 3 in
Dec 01 09:14:26 compute-0 ceph-mon[75031]: from='osd.2 [v2:192.168.122.100:6810/3298763466,v1:192.168.122.100:6811/3298763466]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]: dispatch
Dec 01 09:14:26 compute-0 ceph-mon[75031]: pgmap v40: 1 pgs: 1 unknown; 0 B data, 426 MiB used, 20 GiB / 20 GiB avail
Dec 01 09:14:26 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:26 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:26 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:26 compute-0 systemd[1]: libpod-3abf7fecd204be53f513dfa3e5c77f23f757d07dd2ecad5fae269de8a32b3cd6.scope: Deactivated successfully.
Dec 01 09:14:26 compute-0 systemd[1]: libpod-3abf7fecd204be53f513dfa3e5c77f23f757d07dd2ecad5fae269de8a32b3cd6.scope: Consumed 1.634s CPU time.
Dec 01 09:14:26 compute-0 podman[90747]: 2025-12-01 09:14:26.023669266 +0000 UTC m=+1.937281252 container died 3abf7fecd204be53f513dfa3e5c77f23f757d07dd2ecad5fae269de8a32b3cd6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_stonebraker, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:14:26 compute-0 ceph-osd[89052]: osd.1 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 10.849 iops: 2777.442 elapsed_sec: 1.080
Dec 01 09:14:26 compute-0 ceph-osd[89052]: log_channel(cluster) log [WRN] : OSD bench result of 2777.441874 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.1. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Dec 01 09:14:26 compute-0 ceph-osd[89052]: osd.1 0 waiting for initial osdmap
Dec 01 09:14:26 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1[89048]: 2025-12-01T09:14:26.184+0000 7f11a1d9b640 -1 osd.1 0 waiting for initial osdmap
Dec 01 09:14:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-ca3153f023110b042fab684674e4433ee376f42db6be1bffa82ccc6d35957b76-merged.mount: Deactivated successfully.
Dec 01 09:14:26 compute-0 ceph-osd[89052]: osd.1 15 crush map has features 288514051259236352, adjusting msgr requires for clients
Dec 01 09:14:26 compute-0 ceph-osd[89052]: osd.1 15 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Dec 01 09:14:26 compute-0 ceph-osd[89052]: osd.1 15 crush map has features 3314933000852226048, adjusting msgr requires for osds
Dec 01 09:14:26 compute-0 ceph-osd[89052]: osd.1 15 check_osdmap_features require_osd_release unknown -> reef
Dec 01 09:14:26 compute-0 ceph-osd[89052]: osd.1 15 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec 01 09:14:26 compute-0 ceph-osd[89052]: osd.1 15 set_numa_affinity not setting numa affinity
Dec 01 09:14:26 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-1[89048]: 2025-12-01T09:14:26.240+0000 7f119d3c3640 -1 osd.1 15 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec 01 09:14:26 compute-0 ceph-osd[89052]: osd.1 15 _collect_metadata loop4:  no unique device id for loop4: fallback method has no model nor serial
Dec 01 09:14:26 compute-0 podman[90747]: 2025-12-01 09:14:26.266601191 +0000 UTC m=+2.180213137 container remove 3abf7fecd204be53f513dfa3e5c77f23f757d07dd2ecad5fae269de8a32b3cd6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_stonebraker, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Dec 01 09:14:26 compute-0 systemd[1]: libpod-conmon-3abf7fecd204be53f513dfa3e5c77f23f757d07dd2ecad5fae269de8a32b3cd6.scope: Deactivated successfully.
Dec 01 09:14:26 compute-0 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.1 v2:192.168.122.100:6806/1623497857; not ready for session (expect reconnect)
Dec 01 09:14:26 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec 01 09:14:26 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:26 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec 01 09:14:26 compute-0 sudo[90261]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:26 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:14:26 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:26 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:14:26 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:26 compute-0 sudo[90844]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:26 compute-0 sudo[90844]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:26 compute-0 sudo[90844]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:26 compute-0 sudo[90869]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:14:26 compute-0 sudo[90869]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:26 compute-0 sudo[90869]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:26 compute-0 sudo[90894]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:26 compute-0 sudo[90894]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:26 compute-0 sudo[90894]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:26 compute-0 sudo[90919]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:14:26 compute-0 sudo[90919]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:26 compute-0 sudo[90919]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:26 compute-0 sudo[90944]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:26 compute-0 sudo[90944]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:26 compute-0 sudo[90944]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:26 compute-0 sudo[90969]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ls
Dec 01 09:14:26 compute-0 sudo[90969]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:26 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v43: 1 pgs: 1 unknown; 0 B data, 426 MiB used, 20 GiB / 20 GiB avail
Dec 01 09:14:27 compute-0 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.2 v2:192.168.122.100:6810/3298763466; not ready for session (expect reconnect)
Dec 01 09:14:27 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e15 do_prune osdmap full prune enabled
Dec 01 09:14:27 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec 01 09:14:27 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:27 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec 01 09:14:27 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e16 e16: 3 total, 2 up, 3 in
Dec 01 09:14:27 compute-0 ceph-mon[75031]: log_channel(cluster) log [INF] : osd.1 [v2:192.168.122.100:6806/1623497857,v1:192.168.122.100:6807/1623497857] boot
Dec 01 09:14:27 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e16: 3 total, 2 up, 3 in
Dec 01 09:14:27 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) v1
Dec 01 09:14:27 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:27 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec 01 09:14:27 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:27 compute-0 ceph-osd[89052]: osd.1 16 state: booting -> active
Dec 01 09:14:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 16 pg[1.0( empty local-lis/les=0/0 n=0 ec=12/12 lis/c=0/0 les/c/f=0/0/0 sis=16) [1] r=0 lpr=16 pi=[12,16)/0 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:14:27 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec 01 09:14:27 compute-0 ceph-mon[75031]: from='osd.2 [v2:192.168.122.100:6810/3298763466,v1:192.168.122.100:6811/3298763466]' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]': finished
Dec 01 09:14:27 compute-0 ceph-mon[75031]: osdmap e15: 3 total, 1 up, 3 in
Dec 01 09:14:27 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:27 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:27 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:27 compute-0 ceph-mon[75031]: OSD bench result of 2777.441874 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.1. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Dec 01 09:14:27 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:27 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:27 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:27 compute-0 podman[91066]: 2025-12-01 09:14:27.473922443 +0000 UTC m=+0.141761562 container exec a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS)
Dec 01 09:14:27 compute-0 podman[91066]: 2025-12-01 09:14:27.642830582 +0000 UTC m=+0.310669681 container exec_died a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Dec 01 09:14:27 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e16 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:14:28 compute-0 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.2 v2:192.168.122.100:6810/3298763466; not ready for session (expect reconnect)
Dec 01 09:14:28 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec 01 09:14:28 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:28 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec 01 09:14:28 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e16 do_prune osdmap full prune enabled
Dec 01 09:14:28 compute-0 ceph-mon[75031]: purged_snaps scrub starts
Dec 01 09:14:28 compute-0 ceph-mon[75031]: purged_snaps scrub ok
Dec 01 09:14:28 compute-0 ceph-mon[75031]: pgmap v43: 1 pgs: 1 unknown; 0 B data, 426 MiB used, 20 GiB / 20 GiB avail
Dec 01 09:14:28 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:28 compute-0 ceph-mon[75031]: osd.1 [v2:192.168.122.100:6806/1623497857,v1:192.168.122.100:6807/1623497857] boot
Dec 01 09:14:28 compute-0 ceph-mon[75031]: osdmap e16: 3 total, 2 up, 3 in
Dec 01 09:14:28 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:14:28 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:28 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:28 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e17 e17: 3 total, 2 up, 3 in
Dec 01 09:14:28 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e17: 3 total, 2 up, 3 in
Dec 01 09:14:28 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec 01 09:14:28 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:28 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec 01 09:14:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 17 pg[1.0( empty local-lis/les=16/17 n=0 ec=12/12 lis/c=0/0 les/c/f=0/0/0 sis=16) [1] r=0 lpr=16 pi=[12,16)/0 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:14:28 compute-0 ceph-mgr[75324]: [devicehealth INFO root] creating main.db for devicehealth
Dec 01 09:14:28 compute-0 sudo[90969]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:28 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:14:28 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:28 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:14:28 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:28 compute-0 ceph-mgr[75324]: [devicehealth INFO root] Check health
Dec 01 09:14:28 compute-0 ceph-mgr[75324]: [devicehealth ERROR root] Fail to parse JSON result from daemon osd.2 ()
Dec 01 09:14:28 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch
Dec 01 09:14:28 compute-0 sudo[91198]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:28 compute-0 sudo[91198]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:28 compute-0 sudo[91198]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:28 compute-0 sudo[91223]:     ceph : PWD=/ ; USER=root ; COMMAND=/usr/sbin/smartctl -x --json=o /dev/vda
Dec 01 09:14:28 compute-0 sudo[91223]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 01 09:14:28 compute-0 sudo[91223]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=167)
Dec 01 09:14:28 compute-0 sudo[91223]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:28 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='admin socket' entity='admin socket' cmd=smart args=[json]: finished
Dec 01 09:14:28 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon metadata", "id": "compute-0"} v 0) v1
Dec 01 09:14:28 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Dec 01 09:14:28 compute-0 sudo[91225]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:14:28 compute-0 sudo[91225]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:28 compute-0 sudo[91225]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:28 compute-0 sudo[91252]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:28 compute-0 sudo[91252]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:28 compute-0 sudo[91252]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:29 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v46: 1 pgs: 1 creating+peering; 0 B data, 853 MiB used, 39 GiB / 40 GiB avail
Dec 01 09:14:29 compute-0 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.2 v2:192.168.122.100:6810/3298763466; not ready for session (expect reconnect)
Dec 01 09:14:29 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec 01 09:14:29 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:29 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec 01 09:14:29 compute-0 sudo[91277]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Dec 01 09:14:29 compute-0 sudo[91277]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:29 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e17 do_prune osdmap full prune enabled
Dec 01 09:14:29 compute-0 ceph-mon[75031]: osdmap e17: 3 total, 2 up, 3 in
Dec 01 09:14:29 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:29 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:29 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:29 compute-0 ceph-mon[75031]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch
Dec 01 09:14:29 compute-0 ceph-mon[75031]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished
Dec 01 09:14:29 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Dec 01 09:14:29 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:29 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e18 e18: 3 total, 2 up, 3 in
Dec 01 09:14:29 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e18: 3 total, 2 up, 3 in
Dec 01 09:14:29 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec 01 09:14:29 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:29 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec 01 09:14:29 compute-0 sudo[91277]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:29 compute-0 sudo[91334]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:29 compute-0 sudo[91334]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:29 compute-0 sudo[91334]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:29 compute-0 sudo[91359]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:14:29 compute-0 sudo[91359]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:29 compute-0 sudo[91359]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:29 compute-0 sudo[91384]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:29 compute-0 sudo[91384]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:29 compute-0 sudo[91384]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:29 compute-0 sudo[91409]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- inventory --format=json-pretty --filter-for-batch
Dec 01 09:14:29 compute-0 sudo[91409]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:30 compute-0 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.2 v2:192.168.122.100:6810/3298763466; not ready for session (expect reconnect)
Dec 01 09:14:30 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec 01 09:14:30 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:30 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec 01 09:14:30 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : mgrmap e9: compute-0.psduho(active, since 77s)
Dec 01 09:14:30 compute-0 ceph-mon[75031]: osdmap e18: 3 total, 2 up, 3 in
Dec 01 09:14:30 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:30 compute-0 podman[91472]: 2025-12-01 09:14:30.899959525 +0000 UTC m=+0.056476782 container create 2715f1246d4a73121a5d6761e2bcf18f4125e078f2c584ee4aac5777463295f6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_raman, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=reef)
Dec 01 09:14:30 compute-0 podman[91472]: 2025-12-01 09:14:30.874434597 +0000 UTC m=+0.030951854 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:14:30 compute-0 systemd[1]: Started libpod-conmon-2715f1246d4a73121a5d6761e2bcf18f4125e078f2c584ee4aac5777463295f6.scope.
Dec 01 09:14:31 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:31 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v48: 1 pgs: 1 creating+peering; 0 B data, 453 MiB used, 40 GiB / 40 GiB avail
Dec 01 09:14:31 compute-0 podman[91472]: 2025-12-01 09:14:31.10484088 +0000 UTC m=+0.261358147 container init 2715f1246d4a73121a5d6761e2bcf18f4125e078f2c584ee4aac5777463295f6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_raman, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:14:31 compute-0 podman[91472]: 2025-12-01 09:14:31.115453204 +0000 UTC m=+0.271970441 container start 2715f1246d4a73121a5d6761e2bcf18f4125e078f2c584ee4aac5777463295f6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_raman, org.label-schema.build-date=20250507, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:14:31 compute-0 exciting_raman[91489]: 167 167
Dec 01 09:14:31 compute-0 systemd[1]: libpod-2715f1246d4a73121a5d6761e2bcf18f4125e078f2c584ee4aac5777463295f6.scope: Deactivated successfully.
Dec 01 09:14:31 compute-0 podman[91472]: 2025-12-01 09:14:31.151041399 +0000 UTC m=+0.307558656 container attach 2715f1246d4a73121a5d6761e2bcf18f4125e078f2c584ee4aac5777463295f6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_raman, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:14:31 compute-0 podman[91472]: 2025-12-01 09:14:31.152978648 +0000 UTC m=+0.309495905 container died 2715f1246d4a73121a5d6761e2bcf18f4125e078f2c584ee4aac5777463295f6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_raman, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:14:31 compute-0 systemd[1]: var-lib-containers-storage-overlay-9b646604e537e42241c5e222822e81cc09c9b2d587e58283103bef290df90daa-merged.mount: Deactivated successfully.
Dec 01 09:14:31 compute-0 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.2 v2:192.168.122.100:6810/3298763466; not ready for session (expect reconnect)
Dec 01 09:14:31 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec 01 09:14:31 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:31 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec 01 09:14:31 compute-0 podman[91472]: 2025-12-01 09:14:31.288678323 +0000 UTC m=+0.445195560 container remove 2715f1246d4a73121a5d6761e2bcf18f4125e078f2c584ee4aac5777463295f6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_raman, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:14:31 compute-0 systemd[1]: libpod-conmon-2715f1246d4a73121a5d6761e2bcf18f4125e078f2c584ee4aac5777463295f6.scope: Deactivated successfully.
Dec 01 09:14:31 compute-0 podman[91513]: 2025-12-01 09:14:31.603464738 +0000 UTC m=+0.048835698 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:14:31 compute-0 podman[91513]: 2025-12-01 09:14:31.718920498 +0000 UTC m=+0.164291428 container create ad89481bc39a9a5d369558c0d6fee9a8cb4092e702c459ce9f791e92b17bd620 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_edison, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3)
Dec 01 09:14:31 compute-0 ceph-mon[75031]: pgmap v46: 1 pgs: 1 creating+peering; 0 B data, 853 MiB used, 39 GiB / 40 GiB avail
Dec 01 09:14:31 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:31 compute-0 ceph-mon[75031]: mgrmap e9: compute-0.psduho(active, since 77s)
Dec 01 09:14:31 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:31 compute-0 systemd[1]: Started libpod-conmon-ad89481bc39a9a5d369558c0d6fee9a8cb4092e702c459ce9f791e92b17bd620.scope.
Dec 01 09:14:31 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd7781993f4251c8648ab23bae0fc2a803ce88acc85341d052ed407a11cbb377/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd7781993f4251c8648ab23bae0fc2a803ce88acc85341d052ed407a11cbb377/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd7781993f4251c8648ab23bae0fc2a803ce88acc85341d052ed407a11cbb377/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd7781993f4251c8648ab23bae0fc2a803ce88acc85341d052ed407a11cbb377/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:31 compute-0 podman[91513]: 2025-12-01 09:14:31.833000455 +0000 UTC m=+0.278371405 container init ad89481bc39a9a5d369558c0d6fee9a8cb4092e702c459ce9f791e92b17bd620 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_edison, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 01 09:14:31 compute-0 podman[91513]: 2025-12-01 09:14:31.840420031 +0000 UTC m=+0.285790961 container start ad89481bc39a9a5d369558c0d6fee9a8cb4092e702c459ce9f791e92b17bd620 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_edison, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec 01 09:14:31 compute-0 podman[91513]: 2025-12-01 09:14:31.858468532 +0000 UTC m=+0.303839552 container attach ad89481bc39a9a5d369558c0d6fee9a8cb4092e702c459ce9f791e92b17bd620 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_edison, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Dec 01 09:14:32 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec 01 09:14:32 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:32 compute-0 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.2 v2:192.168.122.100:6810/3298763466; not ready for session (expect reconnect)
Dec 01 09:14:32 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec 01 09:14:32 compute-0 ceph-mon[75031]: pgmap v48: 1 pgs: 1 creating+peering; 0 B data, 453 MiB used, 40 GiB / 40 GiB avail
Dec 01 09:14:32 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:32 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e18 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:14:33 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v49: 1 pgs: 1 active+clean; 449 KiB data, 453 MiB used, 40 GiB / 40 GiB avail
Dec 01 09:14:33 compute-0 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.2 v2:192.168.122.100:6810/3298763466; not ready for session (expect reconnect)
Dec 01 09:14:33 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec 01 09:14:33 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:33 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec 01 09:14:33 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:33 compute-0 frosty_edison[91530]: [
Dec 01 09:14:33 compute-0 frosty_edison[91530]:     {
Dec 01 09:14:33 compute-0 frosty_edison[91530]:         "available": false,
Dec 01 09:14:33 compute-0 frosty_edison[91530]:         "ceph_device": false,
Dec 01 09:14:33 compute-0 frosty_edison[91530]:         "device_id": "QEMU_DVD-ROM_QM00001",
Dec 01 09:14:33 compute-0 frosty_edison[91530]:         "lsm_data": {},
Dec 01 09:14:33 compute-0 frosty_edison[91530]:         "lvs": [],
Dec 01 09:14:33 compute-0 frosty_edison[91530]:         "path": "/dev/sr0",
Dec 01 09:14:33 compute-0 frosty_edison[91530]:         "rejected_reasons": [
Dec 01 09:14:33 compute-0 frosty_edison[91530]:             "Has a FileSystem",
Dec 01 09:14:33 compute-0 frosty_edison[91530]:             "Insufficient space (<5GB)"
Dec 01 09:14:33 compute-0 frosty_edison[91530]:         ],
Dec 01 09:14:33 compute-0 frosty_edison[91530]:         "sys_api": {
Dec 01 09:14:33 compute-0 frosty_edison[91530]:             "actuators": null,
Dec 01 09:14:33 compute-0 frosty_edison[91530]:             "device_nodes": "sr0",
Dec 01 09:14:33 compute-0 frosty_edison[91530]:             "devname": "sr0",
Dec 01 09:14:33 compute-0 frosty_edison[91530]:             "human_readable_size": "482.00 KB",
Dec 01 09:14:33 compute-0 frosty_edison[91530]:             "id_bus": "ata",
Dec 01 09:14:33 compute-0 frosty_edison[91530]:             "model": "QEMU DVD-ROM",
Dec 01 09:14:33 compute-0 frosty_edison[91530]:             "nr_requests": "2",
Dec 01 09:14:33 compute-0 frosty_edison[91530]:             "parent": "/dev/sr0",
Dec 01 09:14:33 compute-0 frosty_edison[91530]:             "partitions": {},
Dec 01 09:14:33 compute-0 frosty_edison[91530]:             "path": "/dev/sr0",
Dec 01 09:14:33 compute-0 frosty_edison[91530]:             "removable": "1",
Dec 01 09:14:33 compute-0 frosty_edison[91530]:             "rev": "2.5+",
Dec 01 09:14:33 compute-0 frosty_edison[91530]:             "ro": "0",
Dec 01 09:14:33 compute-0 frosty_edison[91530]:             "rotational": "1",
Dec 01 09:14:33 compute-0 frosty_edison[91530]:             "sas_address": "",
Dec 01 09:14:33 compute-0 frosty_edison[91530]:             "sas_device_handle": "",
Dec 01 09:14:33 compute-0 frosty_edison[91530]:             "scheduler_mode": "mq-deadline",
Dec 01 09:14:33 compute-0 frosty_edison[91530]:             "sectors": 0,
Dec 01 09:14:33 compute-0 frosty_edison[91530]:             "sectorsize": "2048",
Dec 01 09:14:33 compute-0 frosty_edison[91530]:             "size": 493568.0,
Dec 01 09:14:33 compute-0 frosty_edison[91530]:             "support_discard": "2048",
Dec 01 09:14:33 compute-0 frosty_edison[91530]:             "type": "disk",
Dec 01 09:14:33 compute-0 frosty_edison[91530]:             "vendor": "QEMU"
Dec 01 09:14:33 compute-0 frosty_edison[91530]:         }
Dec 01 09:14:33 compute-0 frosty_edison[91530]:     }
Dec 01 09:14:33 compute-0 frosty_edison[91530]: ]
Dec 01 09:14:33 compute-0 ceph-osd[90166]: osd.2 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 16.769 iops: 4292.904 elapsed_sec: 0.699
Dec 01 09:14:33 compute-0 ceph-osd[90166]: log_channel(cluster) log [WRN] : OSD bench result of 4292.903722 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Dec 01 09:14:33 compute-0 ceph-osd[90166]: osd.2 0 waiting for initial osdmap
Dec 01 09:14:33 compute-0 systemd[1]: libpod-ad89481bc39a9a5d369558c0d6fee9a8cb4092e702c459ce9f791e92b17bd620.scope: Deactivated successfully.
Dec 01 09:14:33 compute-0 systemd[1]: libpod-ad89481bc39a9a5d369558c0d6fee9a8cb4092e702c459ce9f791e92b17bd620.scope: Consumed 2.074s CPU time.
Dec 01 09:14:33 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2[90162]: 2025-12-01T09:14:33.870+0000 7f121c7f5640 -1 osd.2 0 waiting for initial osdmap
Dec 01 09:14:33 compute-0 podman[91513]: 2025-12-01 09:14:33.874407081 +0000 UTC m=+2.319778011 container died ad89481bc39a9a5d369558c0d6fee9a8cb4092e702c459ce9f791e92b17bd620 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_edison, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507)
Dec 01 09:14:33 compute-0 ceph-osd[90166]: osd.2 18 crush map has features 288514051259236352, adjusting msgr requires for clients
Dec 01 09:14:33 compute-0 ceph-osd[90166]: osd.2 18 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Dec 01 09:14:33 compute-0 ceph-osd[90166]: osd.2 18 crush map has features 3314933000852226048, adjusting msgr requires for osds
Dec 01 09:14:33 compute-0 ceph-osd[90166]: osd.2 18 check_osdmap_features require_osd_release unknown -> reef
Dec 01 09:14:33 compute-0 ceph-osd[90166]: osd.2 18 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec 01 09:14:33 compute-0 ceph-osd[90166]: osd.2 18 set_numa_affinity not setting numa affinity
Dec 01 09:14:33 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-osd-2[90162]: 2025-12-01T09:14:33.902+0000 7f1217606640 -1 osd.2 18 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec 01 09:14:33 compute-0 ceph-osd[90166]: osd.2 18 _collect_metadata loop5:  no unique device id for loop5: fallback method has no model nor serial
Dec 01 09:14:33 compute-0 systemd[1]: var-lib-containers-storage-overlay-dd7781993f4251c8648ab23bae0fc2a803ce88acc85341d052ed407a11cbb377-merged.mount: Deactivated successfully.
Dec 01 09:14:33 compute-0 podman[91513]: 2025-12-01 09:14:33.958825645 +0000 UTC m=+2.404196575 container remove ad89481bc39a9a5d369558c0d6fee9a8cb4092e702c459ce9f791e92b17bd620 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_edison, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec 01 09:14:33 compute-0 systemd[1]: libpod-conmon-ad89481bc39a9a5d369558c0d6fee9a8cb4092e702c459ce9f791e92b17bd620.scope: Deactivated successfully.
Dec 01 09:14:34 compute-0 sudo[91409]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:34 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:14:34 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:34 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:14:34 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:34 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0) v1
Dec 01 09:14:34 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Dec 01 09:14:34 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0) v1
Dec 01 09:14:34 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Dec 01 09:14:34 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0) v1
Dec 01 09:14:34 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Dec 01 09:14:34 compute-0 ceph-mgr[75324]: [cephadm INFO root] Adjusting osd_memory_target on compute-0 to 43690k
Dec 01 09:14:34 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on compute-0 to 43690k
Dec 01 09:14:34 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) v1
Dec 01 09:14:34 compute-0 ceph-mgr[75324]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on compute-0 to 44739242: error parsing value: Value '44739242' is below minimum 939524096
Dec 01 09:14:34 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on compute-0 to 44739242: error parsing value: Value '44739242' is below minimum 939524096
Dec 01 09:14:34 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:14:34 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:14:34 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec 01 09:14:34 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:14:34 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec 01 09:14:34 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:34 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 45e0131a-7f24-4bba-8ffd-dc6331f7ff73 does not exist
Dec 01 09:14:34 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 8365c79d-8bb7-4c64-a07e-6d2e276939d2 does not exist
Dec 01 09:14:34 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev fc0e2293-dca2-418f-be53-e200727213c6 does not exist
Dec 01 09:14:34 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec 01 09:14:34 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:14:34 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec 01 09:14:34 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:14:34 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:14:34 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:14:34 compute-0 sudo[93174]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:34 compute-0 sudo[93174]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:34 compute-0 sudo[93174]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:34 compute-0 ceph-mgr[75324]: mgr.server handle_open ignoring open from osd.2 v2:192.168.122.100:6810/3298763466; not ready for session (expect reconnect)
Dec 01 09:14:34 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec 01 09:14:34 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:34 compute-0 ceph-mgr[75324]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec 01 09:14:34 compute-0 sudo[93199]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:14:34 compute-0 sudo[93199]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:34 compute-0 sudo[93199]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:34 compute-0 sudo[93224]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:34 compute-0 sudo[93224]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:34 compute-0 sudo[93224]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:34 compute-0 sudo[93249]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --yes --no-systemd
Dec 01 09:14:34 compute-0 sudo[93249]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:34 compute-0 podman[93313]: 2025-12-01 09:14:34.781825591 +0000 UTC m=+0.042454595 container create 48cae05b2760c956482669b5d6beb6c24256b2359ee55e5423c0ec49b650fdf5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_banzai, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec 01 09:14:34 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e18 do_prune osdmap full prune enabled
Dec 01 09:14:34 compute-0 ceph-mon[75031]: pgmap v49: 1 pgs: 1 active+clean; 449 KiB data, 453 MiB used, 40 GiB / 40 GiB avail
Dec 01 09:14:34 compute-0 ceph-mon[75031]: OSD bench result of 4292.903722 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Dec 01 09:14:34 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:34 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:34 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Dec 01 09:14:34 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Dec 01 09:14:34 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Dec 01 09:14:34 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:14:34 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:14:34 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:34 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:14:34 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:14:34 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:14:34 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:34 compute-0 systemd[1]: Started libpod-conmon-48cae05b2760c956482669b5d6beb6c24256b2359ee55e5423c0ec49b650fdf5.scope.
Dec 01 09:14:34 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e19 e19: 3 total, 3 up, 3 in
Dec 01 09:14:34 compute-0 ceph-mon[75031]: log_channel(cluster) log [INF] : osd.2 [v2:192.168.122.100:6810/3298763466,v1:192.168.122.100:6811/3298763466] boot
Dec 01 09:14:34 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e19: 3 total, 3 up, 3 in
Dec 01 09:14:34 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) v1
Dec 01 09:14:34 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:34 compute-0 ceph-osd[90166]: osd.2 19 state: booting -> active
Dec 01 09:14:34 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:34 compute-0 podman[93313]: 2025-12-01 09:14:34.761043798 +0000 UTC m=+0.021672792 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:14:34 compute-0 podman[93313]: 2025-12-01 09:14:34.86017178 +0000 UTC m=+0.120800764 container init 48cae05b2760c956482669b5d6beb6c24256b2359ee55e5423c0ec49b650fdf5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_banzai, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507)
Dec 01 09:14:34 compute-0 podman[93313]: 2025-12-01 09:14:34.872613259 +0000 UTC m=+0.133242223 container start 48cae05b2760c956482669b5d6beb6c24256b2359ee55e5423c0ec49b650fdf5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_banzai, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef)
Dec 01 09:14:34 compute-0 podman[93313]: 2025-12-01 09:14:34.876664142 +0000 UTC m=+0.137293106 container attach 48cae05b2760c956482669b5d6beb6c24256b2359ee55e5423c0ec49b650fdf5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_banzai, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, io.buildah.version=1.39.3)
Dec 01 09:14:34 compute-0 lucid_banzai[93330]: 167 167
Dec 01 09:14:34 compute-0 systemd[1]: libpod-48cae05b2760c956482669b5d6beb6c24256b2359ee55e5423c0ec49b650fdf5.scope: Deactivated successfully.
Dec 01 09:14:34 compute-0 podman[93313]: 2025-12-01 09:14:34.879715735 +0000 UTC m=+0.140344699 container died 48cae05b2760c956482669b5d6beb6c24256b2359ee55e5423c0ec49b650fdf5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_banzai, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3)
Dec 01 09:14:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-26b7bf6ff0c6be4409adb10fb6ebf9fd57a8c05e6ecd10a5bb73d02c3a8501cc-merged.mount: Deactivated successfully.
Dec 01 09:14:34 compute-0 podman[93313]: 2025-12-01 09:14:34.913523956 +0000 UTC m=+0.174152920 container remove 48cae05b2760c956482669b5d6beb6c24256b2359ee55e5423c0ec49b650fdf5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_banzai, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:14:34 compute-0 systemd[1]: libpod-conmon-48cae05b2760c956482669b5d6beb6c24256b2359ee55e5423c0ec49b650fdf5.scope: Deactivated successfully.
Dec 01 09:14:35 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v51: 1 pgs: 1 active+clean; 449 KiB data, 453 MiB used, 40 GiB / 40 GiB avail
Dec 01 09:14:35 compute-0 podman[93355]: 2025-12-01 09:14:35.064004983 +0000 UTC m=+0.033524753 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:14:35 compute-0 podman[93355]: 2025-12-01 09:14:35.429074079 +0000 UTC m=+0.398593859 container create 345171687dc072aad73d06f26abeafcd4d0dbd53e59aec612f1578d0ee00019a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_zhukovsky, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=reef)
Dec 01 09:14:35 compute-0 systemd[1]: Started libpod-conmon-345171687dc072aad73d06f26abeafcd4d0dbd53e59aec612f1578d0ee00019a.scope.
Dec 01 09:14:35 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8ec28d90325cfbfc44affa0ec33d64c9e2854cbcdf2ea17b7f776acbc5be2361/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8ec28d90325cfbfc44affa0ec33d64c9e2854cbcdf2ea17b7f776acbc5be2361/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8ec28d90325cfbfc44affa0ec33d64c9e2854cbcdf2ea17b7f776acbc5be2361/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8ec28d90325cfbfc44affa0ec33d64c9e2854cbcdf2ea17b7f776acbc5be2361/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8ec28d90325cfbfc44affa0ec33d64c9e2854cbcdf2ea17b7f776acbc5be2361/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:35 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e19 do_prune osdmap full prune enabled
Dec 01 09:14:35 compute-0 ceph-mon[75031]: Adjusting osd_memory_target on compute-0 to 43690k
Dec 01 09:14:35 compute-0 ceph-mon[75031]: Unable to set osd_memory_target on compute-0 to 44739242: error parsing value: Value '44739242' is below minimum 939524096
Dec 01 09:14:35 compute-0 ceph-mon[75031]: osd.2 [v2:192.168.122.100:6810/3298763466,v1:192.168.122.100:6811/3298763466] boot
Dec 01 09:14:35 compute-0 ceph-mon[75031]: osdmap e19: 3 total, 3 up, 3 in
Dec 01 09:14:35 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:14:35 compute-0 podman[93355]: 2025-12-01 09:14:35.898599092 +0000 UTC m=+0.868118942 container init 345171687dc072aad73d06f26abeafcd4d0dbd53e59aec612f1578d0ee00019a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_zhukovsky, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS)
Dec 01 09:14:35 compute-0 podman[93355]: 2025-12-01 09:14:35.909879836 +0000 UTC m=+0.879399616 container start 345171687dc072aad73d06f26abeafcd4d0dbd53e59aec612f1578d0ee00019a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_zhukovsky, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 01 09:14:36 compute-0 podman[93355]: 2025-12-01 09:14:36.16820844 +0000 UTC m=+1.137728210 container attach 345171687dc072aad73d06f26abeafcd4d0dbd53e59aec612f1578d0ee00019a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_zhukovsky, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507)
Dec 01 09:14:36 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e20 e20: 3 total, 3 up, 3 in
Dec 01 09:14:36 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e20: 3 total, 3 up, 3 in
Dec 01 09:14:36 compute-0 ceph-mon[75031]: pgmap v51: 1 pgs: 1 active+clean; 449 KiB data, 453 MiB used, 40 GiB / 40 GiB avail
Dec 01 09:14:36 compute-0 ceph-mon[75031]: osdmap e20: 3 total, 3 up, 3 in
Dec 01 09:14:37 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v53: 1 pgs: 1 active+clean; 449 KiB data, 880 MiB used, 59 GiB / 60 GiB avail
Dec 01 09:14:37 compute-0 festive_zhukovsky[93372]: --> passed data devices: 0 physical, 3 LVM
Dec 01 09:14:37 compute-0 festive_zhukovsky[93372]: --> relative data size: 1.0
Dec 01 09:14:37 compute-0 festive_zhukovsky[93372]: --> All data devices are unavailable
Dec 01 09:14:37 compute-0 systemd[1]: libpod-345171687dc072aad73d06f26abeafcd4d0dbd53e59aec612f1578d0ee00019a.scope: Deactivated successfully.
Dec 01 09:14:37 compute-0 systemd[1]: libpod-345171687dc072aad73d06f26abeafcd4d0dbd53e59aec612f1578d0ee00019a.scope: Consumed 1.237s CPU time.
Dec 01 09:14:37 compute-0 podman[93355]: 2025-12-01 09:14:37.197403203 +0000 UTC m=+2.166922973 container died 345171687dc072aad73d06f26abeafcd4d0dbd53e59aec612f1578d0ee00019a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_zhukovsky, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Dec 01 09:14:37 compute-0 systemd[1]: var-lib-containers-storage-overlay-8ec28d90325cfbfc44affa0ec33d64c9e2854cbcdf2ea17b7f776acbc5be2361-merged.mount: Deactivated successfully.
Dec 01 09:14:37 compute-0 podman[93355]: 2025-12-01 09:14:37.260549287 +0000 UTC m=+2.230069057 container remove 345171687dc072aad73d06f26abeafcd4d0dbd53e59aec612f1578d0ee00019a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_zhukovsky, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 01 09:14:37 compute-0 systemd[1]: libpod-conmon-345171687dc072aad73d06f26abeafcd4d0dbd53e59aec612f1578d0ee00019a.scope: Deactivated successfully.
Dec 01 09:14:37 compute-0 sudo[93249]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:37 compute-0 sudo[93414]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:37 compute-0 sudo[93414]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:37 compute-0 sudo[93414]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:37 compute-0 sudo[93439]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:14:37 compute-0 sudo[93439]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:37 compute-0 sudo[93439]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:37 compute-0 sudo[93464]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:37 compute-0 sudo[93464]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:37 compute-0 sudo[93464]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:37 compute-0 sudo[93489]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- lvm list --format json
Dec 01 09:14:37 compute-0 sudo[93489]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:37 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e20 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:14:37 compute-0 podman[93552]: 2025-12-01 09:14:37.97481632 +0000 UTC m=+0.041046822 container create 5b298c80e1b4c7f214717e228a3baab0f08f05aa919bd2f47ed5ee43712e9286 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_shaw, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3)
Dec 01 09:14:38 compute-0 systemd[1]: Started libpod-conmon-5b298c80e1b4c7f214717e228a3baab0f08f05aa919bd2f47ed5ee43712e9286.scope.
Dec 01 09:14:38 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:38 compute-0 podman[93552]: 2025-12-01 09:14:37.957151181 +0000 UTC m=+0.023381703 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:14:38 compute-0 podman[93552]: 2025-12-01 09:14:38.079890883 +0000 UTC m=+0.146121405 container init 5b298c80e1b4c7f214717e228a3baab0f08f05aa919bd2f47ed5ee43712e9286 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_shaw, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:14:38 compute-0 podman[93552]: 2025-12-01 09:14:38.092837258 +0000 UTC m=+0.159067770 container start 5b298c80e1b4c7f214717e228a3baab0f08f05aa919bd2f47ed5ee43712e9286 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_shaw, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Dec 01 09:14:38 compute-0 podman[93552]: 2025-12-01 09:14:38.097164329 +0000 UTC m=+0.163394871 container attach 5b298c80e1b4c7f214717e228a3baab0f08f05aa919bd2f47ed5ee43712e9286 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_shaw, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Dec 01 09:14:38 compute-0 systemd[1]: libpod-5b298c80e1b4c7f214717e228a3baab0f08f05aa919bd2f47ed5ee43712e9286.scope: Deactivated successfully.
Dec 01 09:14:38 compute-0 strange_shaw[93569]: 167 167
Dec 01 09:14:38 compute-0 podman[93552]: 2025-12-01 09:14:38.101829842 +0000 UTC m=+0.168060374 container died 5b298c80e1b4c7f214717e228a3baab0f08f05aa919bd2f47ed5ee43712e9286 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_shaw, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:14:38 compute-0 conmon[93569]: conmon 5b298c80e1b4c7f21471 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-5b298c80e1b4c7f214717e228a3baab0f08f05aa919bd2f47ed5ee43712e9286.scope/container/memory.events
Dec 01 09:14:38 compute-0 systemd[1]: var-lib-containers-storage-overlay-64098f3859646499d6812202bdf0b27619c95358a26a447d3b385b5f776aff38-merged.mount: Deactivated successfully.
Dec 01 09:14:38 compute-0 podman[93552]: 2025-12-01 09:14:38.167430331 +0000 UTC m=+0.233660833 container remove 5b298c80e1b4c7f214717e228a3baab0f08f05aa919bd2f47ed5ee43712e9286 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_shaw, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:14:38 compute-0 systemd[1]: libpod-conmon-5b298c80e1b4c7f214717e228a3baab0f08f05aa919bd2f47ed5ee43712e9286.scope: Deactivated successfully.
Dec 01 09:14:38 compute-0 podman[93591]: 2025-12-01 09:14:38.366994165 +0000 UTC m=+0.058954328 container create bb0949d355cfaae4b88f0116caf26edc78723fa38af240b43d4eed295fc33e65 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=loving_wu, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:14:38 compute-0 systemd[1]: Started libpod-conmon-bb0949d355cfaae4b88f0116caf26edc78723fa38af240b43d4eed295fc33e65.scope.
Dec 01 09:14:38 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11e2c741073363099782bf8de6d1feed77cb185a8914f94075e5f9cbc5a38d09/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11e2c741073363099782bf8de6d1feed77cb185a8914f94075e5f9cbc5a38d09/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11e2c741073363099782bf8de6d1feed77cb185a8914f94075e5f9cbc5a38d09/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11e2c741073363099782bf8de6d1feed77cb185a8914f94075e5f9cbc5a38d09/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:38 compute-0 podman[93591]: 2025-12-01 09:14:38.349192982 +0000 UTC m=+0.041153155 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:14:38 compute-0 podman[93591]: 2025-12-01 09:14:38.470920943 +0000 UTC m=+0.162881176 container init bb0949d355cfaae4b88f0116caf26edc78723fa38af240b43d4eed295fc33e65 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=loving_wu, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:14:38 compute-0 podman[93591]: 2025-12-01 09:14:38.479400491 +0000 UTC m=+0.171360634 container start bb0949d355cfaae4b88f0116caf26edc78723fa38af240b43d4eed295fc33e65 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=loving_wu, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:14:38 compute-0 podman[93591]: 2025-12-01 09:14:38.483435454 +0000 UTC m=+0.175395617 container attach bb0949d355cfaae4b88f0116caf26edc78723fa38af240b43d4eed295fc33e65 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=loving_wu, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Dec 01 09:14:38 compute-0 ceph-mon[75031]: pgmap v53: 1 pgs: 1 active+clean; 449 KiB data, 880 MiB used, 59 GiB / 60 GiB avail
Dec 01 09:14:39 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v54: 1 pgs: 1 active+clean; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:14:39 compute-0 loving_wu[93608]: {
Dec 01 09:14:39 compute-0 loving_wu[93608]:     "0": [
Dec 01 09:14:39 compute-0 loving_wu[93608]:         {
Dec 01 09:14:39 compute-0 loving_wu[93608]:             "devices": [
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "/dev/loop3"
Dec 01 09:14:39 compute-0 loving_wu[93608]:             ],
Dec 01 09:14:39 compute-0 loving_wu[93608]:             "lv_name": "ceph_lv0",
Dec 01 09:14:39 compute-0 loving_wu[93608]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:14:39 compute-0 loving_wu[93608]:             "lv_size": "21470642176",
Dec 01 09:14:39 compute-0 loving_wu[93608]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:14:39 compute-0 loving_wu[93608]:             "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:14:39 compute-0 loving_wu[93608]:             "name": "ceph_lv0",
Dec 01 09:14:39 compute-0 loving_wu[93608]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:14:39 compute-0 loving_wu[93608]:             "tags": {
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.cluster_name": "ceph",
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.crush_device_class": "",
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.encrypted": "0",
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.osd_id": "0",
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.type": "block",
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.vdo": "0"
Dec 01 09:14:39 compute-0 loving_wu[93608]:             },
Dec 01 09:14:39 compute-0 loving_wu[93608]:             "type": "block",
Dec 01 09:14:39 compute-0 loving_wu[93608]:             "vg_name": "ceph_vg0"
Dec 01 09:14:39 compute-0 loving_wu[93608]:         }
Dec 01 09:14:39 compute-0 loving_wu[93608]:     ],
Dec 01 09:14:39 compute-0 loving_wu[93608]:     "1": [
Dec 01 09:14:39 compute-0 loving_wu[93608]:         {
Dec 01 09:14:39 compute-0 loving_wu[93608]:             "devices": [
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "/dev/loop4"
Dec 01 09:14:39 compute-0 loving_wu[93608]:             ],
Dec 01 09:14:39 compute-0 loving_wu[93608]:             "lv_name": "ceph_lv1",
Dec 01 09:14:39 compute-0 loving_wu[93608]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:14:39 compute-0 loving_wu[93608]:             "lv_size": "21470642176",
Dec 01 09:14:39 compute-0 loving_wu[93608]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:14:39 compute-0 loving_wu[93608]:             "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:14:39 compute-0 loving_wu[93608]:             "name": "ceph_lv1",
Dec 01 09:14:39 compute-0 loving_wu[93608]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:14:39 compute-0 loving_wu[93608]:             "tags": {
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.cluster_name": "ceph",
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.crush_device_class": "",
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.encrypted": "0",
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.osd_id": "1",
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.type": "block",
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.vdo": "0"
Dec 01 09:14:39 compute-0 loving_wu[93608]:             },
Dec 01 09:14:39 compute-0 loving_wu[93608]:             "type": "block",
Dec 01 09:14:39 compute-0 loving_wu[93608]:             "vg_name": "ceph_vg1"
Dec 01 09:14:39 compute-0 loving_wu[93608]:         }
Dec 01 09:14:39 compute-0 loving_wu[93608]:     ],
Dec 01 09:14:39 compute-0 loving_wu[93608]:     "2": [
Dec 01 09:14:39 compute-0 loving_wu[93608]:         {
Dec 01 09:14:39 compute-0 loving_wu[93608]:             "devices": [
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "/dev/loop5"
Dec 01 09:14:39 compute-0 loving_wu[93608]:             ],
Dec 01 09:14:39 compute-0 loving_wu[93608]:             "lv_name": "ceph_lv2",
Dec 01 09:14:39 compute-0 loving_wu[93608]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:14:39 compute-0 loving_wu[93608]:             "lv_size": "21470642176",
Dec 01 09:14:39 compute-0 loving_wu[93608]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:14:39 compute-0 loving_wu[93608]:             "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:14:39 compute-0 loving_wu[93608]:             "name": "ceph_lv2",
Dec 01 09:14:39 compute-0 loving_wu[93608]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:14:39 compute-0 loving_wu[93608]:             "tags": {
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.cluster_name": "ceph",
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.crush_device_class": "",
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.encrypted": "0",
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.osd_id": "2",
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.type": "block",
Dec 01 09:14:39 compute-0 loving_wu[93608]:                 "ceph.vdo": "0"
Dec 01 09:14:39 compute-0 loving_wu[93608]:             },
Dec 01 09:14:39 compute-0 loving_wu[93608]:             "type": "block",
Dec 01 09:14:39 compute-0 loving_wu[93608]:             "vg_name": "ceph_vg2"
Dec 01 09:14:39 compute-0 loving_wu[93608]:         }
Dec 01 09:14:39 compute-0 loving_wu[93608]:     ]
Dec 01 09:14:39 compute-0 loving_wu[93608]: }
Dec 01 09:14:39 compute-0 systemd[1]: libpod-bb0949d355cfaae4b88f0116caf26edc78723fa38af240b43d4eed295fc33e65.scope: Deactivated successfully.
Dec 01 09:14:39 compute-0 podman[93591]: 2025-12-01 09:14:39.301371256 +0000 UTC m=+0.993331449 container died bb0949d355cfaae4b88f0116caf26edc78723fa38af240b43d4eed295fc33e65 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=loving_wu, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Dec 01 09:14:39 compute-0 systemd[1]: var-lib-containers-storage-overlay-11e2c741073363099782bf8de6d1feed77cb185a8914f94075e5f9cbc5a38d09-merged.mount: Deactivated successfully.
Dec 01 09:14:39 compute-0 podman[93591]: 2025-12-01 09:14:39.362473048 +0000 UTC m=+1.054433191 container remove bb0949d355cfaae4b88f0116caf26edc78723fa38af240b43d4eed295fc33e65 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=loving_wu, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_REF=reef, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:14:39 compute-0 systemd[1]: libpod-conmon-bb0949d355cfaae4b88f0116caf26edc78723fa38af240b43d4eed295fc33e65.scope: Deactivated successfully.
Dec 01 09:14:39 compute-0 sudo[93489]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:39 compute-0 sudo[93630]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:39 compute-0 sudo[93630]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:39 compute-0 sudo[93630]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:39 compute-0 sudo[93655]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:14:39 compute-0 sudo[93655]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:39 compute-0 sudo[93655]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:39 compute-0 sudo[93680]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:39 compute-0 sudo[93680]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:39 compute-0 sudo[93680]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:39 compute-0 sudo[93705]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- raw list --format json
Dec 01 09:14:39 compute-0 sudo[93705]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:39 compute-0 podman[93771]: 2025-12-01 09:14:39.997079012 +0000 UTC m=+0.045468917 container create 3a99aff72ffb4c21a1d1a455a1e6535340434649fdf9f800d75becb00bfd2b45 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_lichterman, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3)
Dec 01 09:14:40 compute-0 systemd[1]: Started libpod-conmon-3a99aff72ffb4c21a1d1a455a1e6535340434649fdf9f800d75becb00bfd2b45.scope.
Dec 01 09:14:40 compute-0 podman[93771]: 2025-12-01 09:14:39.975576576 +0000 UTC m=+0.023966501 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:14:40 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:40 compute-0 podman[93771]: 2025-12-01 09:14:40.150660333 +0000 UTC m=+0.199050258 container init 3a99aff72ffb4c21a1d1a455a1e6535340434649fdf9f800d75becb00bfd2b45 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_lichterman, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:14:40 compute-0 podman[93771]: 2025-12-01 09:14:40.157911974 +0000 UTC m=+0.206301879 container start 3a99aff72ffb4c21a1d1a455a1e6535340434649fdf9f800d75becb00bfd2b45 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_lichterman, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:14:40 compute-0 modest_lichterman[93787]: 167 167
Dec 01 09:14:40 compute-0 systemd[1]: libpod-3a99aff72ffb4c21a1d1a455a1e6535340434649fdf9f800d75becb00bfd2b45.scope: Deactivated successfully.
Dec 01 09:14:40 compute-0 podman[93771]: 2025-12-01 09:14:40.181278827 +0000 UTC m=+0.229668752 container attach 3a99aff72ffb4c21a1d1a455a1e6535340434649fdf9f800d75becb00bfd2b45 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_lichterman, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0)
Dec 01 09:14:40 compute-0 podman[93771]: 2025-12-01 09:14:40.182754091 +0000 UTC m=+0.231143996 container died 3a99aff72ffb4c21a1d1a455a1e6535340434649fdf9f800d75becb00bfd2b45 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_lichterman, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec 01 09:14:40 compute-0 systemd[1]: var-lib-containers-storage-overlay-dc9d557c3e7ab3a9c1a53ad24f861f03eafa01e3cb0784eb63335d486fe5ee15-merged.mount: Deactivated successfully.
Dec 01 09:14:40 compute-0 podman[93771]: 2025-12-01 09:14:40.235578352 +0000 UTC m=+0.283968267 container remove 3a99aff72ffb4c21a1d1a455a1e6535340434649fdf9f800d75becb00bfd2b45 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_lichterman, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec 01 09:14:40 compute-0 systemd[1]: libpod-conmon-3a99aff72ffb4c21a1d1a455a1e6535340434649fdf9f800d75becb00bfd2b45.scope: Deactivated successfully.
Dec 01 09:14:40 compute-0 podman[93810]: 2025-12-01 09:14:40.407275345 +0000 UTC m=+0.043771855 container create 7b30aae8077c01dae6ce6eba504ea23ed3f8c6c1104545a2efe83bf84946f609 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_mirzakhani, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:14:40 compute-0 systemd[1]: Started libpod-conmon-7b30aae8077c01dae6ce6eba504ea23ed3f8c6c1104545a2efe83bf84946f609.scope.
Dec 01 09:14:40 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:40 compute-0 podman[93810]: 2025-12-01 09:14:40.386966626 +0000 UTC m=+0.023463166 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:14:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d49931cd36ff1158e6f37e52c4a9f1cb91440fd5272e29551561448793d19eba/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d49931cd36ff1158e6f37e52c4a9f1cb91440fd5272e29551561448793d19eba/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d49931cd36ff1158e6f37e52c4a9f1cb91440fd5272e29551561448793d19eba/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d49931cd36ff1158e6f37e52c4a9f1cb91440fd5272e29551561448793d19eba/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:40 compute-0 podman[93810]: 2025-12-01 09:14:40.509434399 +0000 UTC m=+0.145930949 container init 7b30aae8077c01dae6ce6eba504ea23ed3f8c6c1104545a2efe83bf84946f609 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_mirzakhani, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:14:40 compute-0 podman[93810]: 2025-12-01 09:14:40.518144725 +0000 UTC m=+0.154641245 container start 7b30aae8077c01dae6ce6eba504ea23ed3f8c6c1104545a2efe83bf84946f609 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_mirzakhani, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:14:40 compute-0 podman[93810]: 2025-12-01 09:14:40.521636511 +0000 UTC m=+0.158133031 container attach 7b30aae8077c01dae6ce6eba504ea23ed3f8c6c1104545a2efe83bf84946f609 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_mirzakhani, CEPH_REF=reef, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3)
Dec 01 09:14:40 compute-0 ceph-mon[75031]: pgmap v54: 1 pgs: 1 active+clean; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:14:41 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v55: 1 pgs: 1 active+clean; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:14:41 compute-0 sleepy_mirzakhani[93826]: {
Dec 01 09:14:41 compute-0 sleepy_mirzakhani[93826]:     "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec 01 09:14:41 compute-0 sleepy_mirzakhani[93826]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:14:41 compute-0 sleepy_mirzakhani[93826]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 01 09:14:41 compute-0 sleepy_mirzakhani[93826]:         "osd_id": 0,
Dec 01 09:14:41 compute-0 sleepy_mirzakhani[93826]:         "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:14:41 compute-0 sleepy_mirzakhani[93826]:         "type": "bluestore"
Dec 01 09:14:41 compute-0 sleepy_mirzakhani[93826]:     },
Dec 01 09:14:41 compute-0 sleepy_mirzakhani[93826]:     "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec 01 09:14:41 compute-0 sleepy_mirzakhani[93826]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:14:41 compute-0 sleepy_mirzakhani[93826]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 01 09:14:41 compute-0 sleepy_mirzakhani[93826]:         "osd_id": 1,
Dec 01 09:14:41 compute-0 sleepy_mirzakhani[93826]:         "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:14:41 compute-0 sleepy_mirzakhani[93826]:         "type": "bluestore"
Dec 01 09:14:41 compute-0 sleepy_mirzakhani[93826]:     },
Dec 01 09:14:41 compute-0 sleepy_mirzakhani[93826]:     "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec 01 09:14:41 compute-0 sleepy_mirzakhani[93826]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:14:41 compute-0 sleepy_mirzakhani[93826]:         "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec 01 09:14:41 compute-0 sleepy_mirzakhani[93826]:         "osd_id": 2,
Dec 01 09:14:41 compute-0 sleepy_mirzakhani[93826]:         "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:14:41 compute-0 sleepy_mirzakhani[93826]:         "type": "bluestore"
Dec 01 09:14:41 compute-0 sleepy_mirzakhani[93826]:     }
Dec 01 09:14:41 compute-0 sleepy_mirzakhani[93826]: }
Dec 01 09:14:41 compute-0 systemd[1]: libpod-7b30aae8077c01dae6ce6eba504ea23ed3f8c6c1104545a2efe83bf84946f609.scope: Deactivated successfully.
Dec 01 09:14:41 compute-0 systemd[1]: libpod-7b30aae8077c01dae6ce6eba504ea23ed3f8c6c1104545a2efe83bf84946f609.scope: Consumed 1.106s CPU time.
Dec 01 09:14:41 compute-0 podman[93859]: 2025-12-01 09:14:41.653219773 +0000 UTC m=+0.027654264 container died 7b30aae8077c01dae6ce6eba504ea23ed3f8c6c1104545a2efe83bf84946f609 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_mirzakhani, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:14:41 compute-0 systemd[1]: var-lib-containers-storage-overlay-d49931cd36ff1158e6f37e52c4a9f1cb91440fd5272e29551561448793d19eba-merged.mount: Deactivated successfully.
Dec 01 09:14:41 compute-0 podman[93859]: 2025-12-01 09:14:41.721759342 +0000 UTC m=+0.096193823 container remove 7b30aae8077c01dae6ce6eba504ea23ed3f8c6c1104545a2efe83bf84946f609 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_mirzakhani, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef)
Dec 01 09:14:41 compute-0 systemd[1]: libpod-conmon-7b30aae8077c01dae6ce6eba504ea23ed3f8c6c1104545a2efe83bf84946f609.scope: Deactivated successfully.
Dec 01 09:14:41 compute-0 sudo[93705]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:41 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:14:41 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:41 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:14:41 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:41 compute-0 sudo[93874]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:41 compute-0 sudo[93874]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:41 compute-0 sudo[93874]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:41 compute-0 sudo[93899]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:14:41 compute-0 sudo[93899]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:41 compute-0 sudo[93899]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:41 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/alertmanager/web_user}] v 0) v1
Dec 01 09:14:42 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:42 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/alertmanager/web_password}] v 0) v1
Dec 01 09:14:42 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:42 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/prometheus/web_user}] v 0) v1
Dec 01 09:14:42 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:42 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/prometheus/web_password}] v 0) v1
Dec 01 09:14:42 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:42 compute-0 ceph-mgr[75324]: [cephadm INFO cephadm.serve] Reconfiguring mon.compute-0 (unknown last config time)...
Dec 01 09:14:42 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Reconfiguring mon.compute-0 (unknown last config time)...
Dec 01 09:14:42 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0) v1
Dec 01 09:14:42 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Dec 01 09:14:42 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0) v1
Dec 01 09:14:42 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Dec 01 09:14:42 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:14:42 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:14:42 compute-0 ceph-mgr[75324]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.compute-0 on compute-0
Dec 01 09:14:42 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.compute-0 on compute-0
Dec 01 09:14:42 compute-0 sudo[93924]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:42 compute-0 sudo[93924]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:42 compute-0 sudo[93924]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:42 compute-0 sudo[93949]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:14:42 compute-0 sudo[93949]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:42 compute-0 sudo[93949]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:42 compute-0 sudo[93974]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:42 compute-0 sudo[93974]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:42 compute-0 sudo[93974]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:42 compute-0 sudo[93999]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 _orch deploy --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b
Dec 01 09:14:42 compute-0 sudo[93999]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:42 compute-0 podman[94038]: 2025-12-01 09:14:42.556132885 +0000 UTC m=+0.056305868 container create 56fe87a6e956dbc56c1ee518ce39bd13c0a5370bf0c1c34437dc0adecf1677e2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_archimedes, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Dec 01 09:14:42 compute-0 systemd[1]: Started libpod-conmon-56fe87a6e956dbc56c1ee518ce39bd13c0a5370bf0c1c34437dc0adecf1677e2.scope.
Dec 01 09:14:42 compute-0 podman[94038]: 2025-12-01 09:14:42.526108089 +0000 UTC m=+0.026281072 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:14:42 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:42 compute-0 podman[94038]: 2025-12-01 09:14:42.651742429 +0000 UTC m=+0.151915412 container init 56fe87a6e956dbc56c1ee518ce39bd13c0a5370bf0c1c34437dc0adecf1677e2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_archimedes, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Dec 01 09:14:42 compute-0 podman[94038]: 2025-12-01 09:14:42.659958819 +0000 UTC m=+0.160131792 container start 56fe87a6e956dbc56c1ee518ce39bd13c0a5370bf0c1c34437dc0adecf1677e2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_archimedes, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:14:42 compute-0 podman[94038]: 2025-12-01 09:14:42.663767706 +0000 UTC m=+0.163940679 container attach 56fe87a6e956dbc56c1ee518ce39bd13c0a5370bf0c1c34437dc0adecf1677e2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_archimedes, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, ceph=True)
Dec 01 09:14:42 compute-0 nervous_archimedes[94054]: 167 167
Dec 01 09:14:42 compute-0 systemd[1]: libpod-56fe87a6e956dbc56c1ee518ce39bd13c0a5370bf0c1c34437dc0adecf1677e2.scope: Deactivated successfully.
Dec 01 09:14:42 compute-0 podman[94038]: 2025-12-01 09:14:42.666730666 +0000 UTC m=+0.166903639 container died 56fe87a6e956dbc56c1ee518ce39bd13c0a5370bf0c1c34437dc0adecf1677e2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_archimedes, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:14:42 compute-0 systemd[1]: var-lib-containers-storage-overlay-f0fbe01d42b978c2b801d704e3e62cdc60be826547d475f066d6a794afbb7220-merged.mount: Deactivated successfully.
Dec 01 09:14:42 compute-0 podman[94038]: 2025-12-01 09:14:42.70556851 +0000 UTC m=+0.205741483 container remove 56fe87a6e956dbc56c1ee518ce39bd13c0a5370bf0c1c34437dc0adecf1677e2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_archimedes, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:14:42 compute-0 systemd[1]: libpod-conmon-56fe87a6e956dbc56c1ee518ce39bd13c0a5370bf0c1c34437dc0adecf1677e2.scope: Deactivated successfully.
Dec 01 09:14:42 compute-0 sudo[93999]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:42 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:14:42 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:42 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:14:42 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:42 compute-0 ceph-mgr[75324]: [cephadm INFO cephadm.serve] Reconfiguring mgr.compute-0.psduho (unknown last config time)...
Dec 01 09:14:42 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Reconfiguring mgr.compute-0.psduho (unknown last config time)...
Dec 01 09:14:42 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.compute-0.psduho", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) v1
Dec 01 09:14:42 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.psduho", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Dec 01 09:14:42 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Dec 01 09:14:42 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 01 09:14:42 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:14:42 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:14:42 compute-0 ceph-mgr[75324]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.compute-0.psduho on compute-0
Dec 01 09:14:42 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.compute-0.psduho on compute-0
Dec 01 09:14:42 compute-0 ceph-mon[75031]: pgmap v55: 1 pgs: 1 active+clean; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:14:42 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:42 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:42 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:42 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:42 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:42 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:42 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Dec 01 09:14:42 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Dec 01 09:14:42 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:14:42 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:42 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:42 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.psduho", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Dec 01 09:14:42 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 01 09:14:42 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:14:42 compute-0 sudo[94073]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:42 compute-0 sudo[94073]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:42 compute-0 sudo[94073]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:42 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e20 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:14:42 compute-0 sudo[94098]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:14:42 compute-0 sudo[94098]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:42 compute-0 sudo[94098]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:43 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v56: 1 pgs: 1 active+clean; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:14:43 compute-0 sudo[94123]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:43 compute-0 sudo[94123]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:43 compute-0 sudo[94123]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:14:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:14:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:14:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:14:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:14:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:14:43 compute-0 sudo[94148]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 _orch deploy --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b
Dec 01 09:14:43 compute-0 sudo[94148]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:43 compute-0 podman[94190]: 2025-12-01 09:14:43.391043175 +0000 UTC m=+0.042971021 container create 78be4a4992fc02f2397ae969c18e86ae73bcc07f627fa08a7bb5e969c4331306 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_bhabha, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:14:43 compute-0 systemd[1]: Started libpod-conmon-78be4a4992fc02f2397ae969c18e86ae73bcc07f627fa08a7bb5e969c4331306.scope.
Dec 01 09:14:43 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:43 compute-0 podman[94190]: 2025-12-01 09:14:43.467195726 +0000 UTC m=+0.119123572 container init 78be4a4992fc02f2397ae969c18e86ae73bcc07f627fa08a7bb5e969c4331306 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_bhabha, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS)
Dec 01 09:14:43 compute-0 podman[94190]: 2025-12-01 09:14:43.370640233 +0000 UTC m=+0.022568079 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:14:43 compute-0 podman[94190]: 2025-12-01 09:14:43.472641642 +0000 UTC m=+0.124569468 container start 78be4a4992fc02f2397ae969c18e86ae73bcc07f627fa08a7bb5e969c4331306 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_bhabha, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:14:43 compute-0 podman[94190]: 2025-12-01 09:14:43.476168819 +0000 UTC m=+0.128096695 container attach 78be4a4992fc02f2397ae969c18e86ae73bcc07f627fa08a7bb5e969c4331306 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_bhabha, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Dec 01 09:14:43 compute-0 great_bhabha[94206]: 167 167
Dec 01 09:14:43 compute-0 systemd[1]: libpod-78be4a4992fc02f2397ae969c18e86ae73bcc07f627fa08a7bb5e969c4331306.scope: Deactivated successfully.
Dec 01 09:14:43 compute-0 podman[94190]: 2025-12-01 09:14:43.477990245 +0000 UTC m=+0.129918061 container died 78be4a4992fc02f2397ae969c18e86ae73bcc07f627fa08a7bb5e969c4331306 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_bhabha, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef)
Dec 01 09:14:43 compute-0 systemd[1]: var-lib-containers-storage-overlay-15a56f30a33de5aebfb742a67a74e3d246371ca9202990041651596b4aaa7310-merged.mount: Deactivated successfully.
Dec 01 09:14:43 compute-0 podman[94190]: 2025-12-01 09:14:43.51359117 +0000 UTC m=+0.165518996 container remove 78be4a4992fc02f2397ae969c18e86ae73bcc07f627fa08a7bb5e969c4331306 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_bhabha, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:14:43 compute-0 systemd[1]: libpod-conmon-78be4a4992fc02f2397ae969c18e86ae73bcc07f627fa08a7bb5e969c4331306.scope: Deactivated successfully.
Dec 01 09:14:43 compute-0 sudo[94148]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:43 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:14:43 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:43 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:14:43 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:43 compute-0 sudo[94224]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:43 compute-0 sudo[94224]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:43 compute-0 sudo[94224]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:43 compute-0 sudo[94249]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:14:43 compute-0 sudo[94249]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:43 compute-0 sudo[94249]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:43 compute-0 sudo[94274]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:43 compute-0 sudo[94274]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:43 compute-0 sudo[94274]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:43 compute-0 ceph-mon[75031]: Reconfiguring mon.compute-0 (unknown last config time)...
Dec 01 09:14:43 compute-0 ceph-mon[75031]: Reconfiguring daemon mon.compute-0 on compute-0
Dec 01 09:14:43 compute-0 ceph-mon[75031]: Reconfiguring mgr.compute-0.psduho (unknown last config time)...
Dec 01 09:14:43 compute-0 ceph-mon[75031]: Reconfiguring daemon mgr.compute-0.psduho on compute-0
Dec 01 09:14:43 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:43 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:43 compute-0 sudo[94299]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ls
Dec 01 09:14:43 compute-0 sudo[94299]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:44 compute-0 podman[94397]: 2025-12-01 09:14:44.404056923 +0000 UTC m=+0.058150093 container exec a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:14:44 compute-0 podman[94397]: 2025-12-01 09:14:44.511709785 +0000 UTC m=+0.165802925 container exec_died a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:14:44 compute-0 ceph-mon[75031]: pgmap v56: 1 pgs: 1 active+clean; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:14:44 compute-0 sudo[94299]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:14:45 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:45 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:14:45 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:45 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:14:45 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:14:45 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec 01 09:14:45 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:14:45 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec 01 09:14:45 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:45 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v57: 1 pgs: 1 active+clean; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:14:45 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 6aaf94fd-d6b5-4da8-a92a-420f856e4c4a does not exist
Dec 01 09:14:45 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 12544f78-4fd1-42ca-a06c-5f8eefcb686f does not exist
Dec 01 09:14:45 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 5b1226f3-a7e3-4d6f-b4fa-3ba594acd496 does not exist
Dec 01 09:14:45 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec 01 09:14:45 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:14:45 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec 01 09:14:45 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:14:45 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:14:45 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:14:45 compute-0 sudo[94520]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:45 compute-0 sudo[94520]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:45 compute-0 sudo[94520]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:45 compute-0 sudo[94545]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:14:45 compute-0 sudo[94545]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:45 compute-0 sudo[94545]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:45 compute-0 sudo[94570]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:45 compute-0 sudo[94570]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:45 compute-0 sudo[94570]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:45 compute-0 sudo[94595]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --yes --no-systemd
Dec 01 09:14:45 compute-0 sudo[94595]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:45 compute-0 podman[94659]: 2025-12-01 09:14:45.628727314 +0000 UTC m=+0.054011908 container create 73fc1fc51dad473d7c3c7b8bf4786abb9834fc0401becaf9e17de7c7c4fa695c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=agitated_hofstadter, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default)
Dec 01 09:14:45 compute-0 systemd[1]: Started libpod-conmon-73fc1fc51dad473d7c3c7b8bf4786abb9834fc0401becaf9e17de7c7c4fa695c.scope.
Dec 01 09:14:45 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:45 compute-0 podman[94659]: 2025-12-01 09:14:45.602610368 +0000 UTC m=+0.027895012 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:14:45 compute-0 podman[94659]: 2025-12-01 09:14:45.709842936 +0000 UTC m=+0.135127600 container init 73fc1fc51dad473d7c3c7b8bf4786abb9834fc0401becaf9e17de7c7c4fa695c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=agitated_hofstadter, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec 01 09:14:45 compute-0 podman[94659]: 2025-12-01 09:14:45.720145671 +0000 UTC m=+0.145430255 container start 73fc1fc51dad473d7c3c7b8bf4786abb9834fc0401becaf9e17de7c7c4fa695c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=agitated_hofstadter, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:14:45 compute-0 agitated_hofstadter[94675]: 167 167
Dec 01 09:14:45 compute-0 systemd[1]: libpod-73fc1fc51dad473d7c3c7b8bf4786abb9834fc0401becaf9e17de7c7c4fa695c.scope: Deactivated successfully.
Dec 01 09:14:45 compute-0 podman[94659]: 2025-12-01 09:14:45.724224335 +0000 UTC m=+0.149508949 container attach 73fc1fc51dad473d7c3c7b8bf4786abb9834fc0401becaf9e17de7c7c4fa695c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=agitated_hofstadter, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS)
Dec 01 09:14:45 compute-0 podman[94659]: 2025-12-01 09:14:45.724669718 +0000 UTC m=+0.149954302 container died 73fc1fc51dad473d7c3c7b8bf4786abb9834fc0401becaf9e17de7c7c4fa695c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=agitated_hofstadter, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:14:45 compute-0 systemd[1]: var-lib-containers-storage-overlay-45270ed9d4e0f009d5cf4fad28871f7dad7f625def583b3653bf8e324096edf9-merged.mount: Deactivated successfully.
Dec 01 09:14:45 compute-0 podman[94659]: 2025-12-01 09:14:45.764566795 +0000 UTC m=+0.189851399 container remove 73fc1fc51dad473d7c3c7b8bf4786abb9834fc0401becaf9e17de7c7c4fa695c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=agitated_hofstadter, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:14:45 compute-0 systemd[1]: libpod-conmon-73fc1fc51dad473d7c3c7b8bf4786abb9834fc0401becaf9e17de7c7c4fa695c.scope: Deactivated successfully.
Dec 01 09:14:45 compute-0 podman[94698]: 2025-12-01 09:14:45.994474811 +0000 UTC m=+0.115209971 container create c7dfae658683f9d0178e5db16bc5e2efe41e380ef02b3f36d24a043899738501 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_curran, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.schema-version=1.0)
Dec 01 09:14:46 compute-0 podman[94698]: 2025-12-01 09:14:45.905067707 +0000 UTC m=+0.025802857 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:14:46 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:46 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:46 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:14:46 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:14:46 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:46 compute-0 ceph-mon[75031]: pgmap v57: 1 pgs: 1 active+clean; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:14:46 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:14:46 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:14:46 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:14:46 compute-0 systemd[1]: Started libpod-conmon-c7dfae658683f9d0178e5db16bc5e2efe41e380ef02b3f36d24a043899738501.scope.
Dec 01 09:14:46 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/20e61ba8de1fbeb5168aa88cff2ed4ee0816cb7774fc6eb6dcfa8fe3cc6e8a38/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/20e61ba8de1fbeb5168aa88cff2ed4ee0816cb7774fc6eb6dcfa8fe3cc6e8a38/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/20e61ba8de1fbeb5168aa88cff2ed4ee0816cb7774fc6eb6dcfa8fe3cc6e8a38/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/20e61ba8de1fbeb5168aa88cff2ed4ee0816cb7774fc6eb6dcfa8fe3cc6e8a38/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/20e61ba8de1fbeb5168aa88cff2ed4ee0816cb7774fc6eb6dcfa8fe3cc6e8a38/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:46 compute-0 podman[94698]: 2025-12-01 09:14:46.189502247 +0000 UTC m=+0.310237427 container init c7dfae658683f9d0178e5db16bc5e2efe41e380ef02b3f36d24a043899738501 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_curran, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default)
Dec 01 09:14:46 compute-0 podman[94698]: 2025-12-01 09:14:46.198054987 +0000 UTC m=+0.318790127 container start c7dfae658683f9d0178e5db16bc5e2efe41e380ef02b3f36d24a043899738501 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_curran, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:14:46 compute-0 podman[94698]: 2025-12-01 09:14:46.201548904 +0000 UTC m=+0.322284124 container attach c7dfae658683f9d0178e5db16bc5e2efe41e380ef02b3f36d24a043899738501 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_curran, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=reef, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:14:47 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v58: 1 pgs: 1 active+clean; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:14:47 compute-0 suspicious_curran[94714]: --> passed data devices: 0 physical, 3 LVM
Dec 01 09:14:47 compute-0 suspicious_curran[94714]: --> relative data size: 1.0
Dec 01 09:14:47 compute-0 suspicious_curran[94714]: --> All data devices are unavailable
Dec 01 09:14:47 compute-0 systemd[1]: libpod-c7dfae658683f9d0178e5db16bc5e2efe41e380ef02b3f36d24a043899738501.scope: Deactivated successfully.
Dec 01 09:14:47 compute-0 systemd[1]: libpod-c7dfae658683f9d0178e5db16bc5e2efe41e380ef02b3f36d24a043899738501.scope: Consumed 1.227s CPU time.
Dec 01 09:14:47 compute-0 podman[94698]: 2025-12-01 09:14:47.46315382 +0000 UTC m=+1.583888960 container died c7dfae658683f9d0178e5db16bc5e2efe41e380ef02b3f36d24a043899738501 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_curran, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:14:47 compute-0 systemd[1]: var-lib-containers-storage-overlay-20e61ba8de1fbeb5168aa88cff2ed4ee0816cb7774fc6eb6dcfa8fe3cc6e8a38-merged.mount: Deactivated successfully.
Dec 01 09:14:47 compute-0 podman[94698]: 2025-12-01 09:14:47.587583353 +0000 UTC m=+1.708318483 container remove c7dfae658683f9d0178e5db16bc5e2efe41e380ef02b3f36d24a043899738501 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_curran, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, OSD_FLAVOR=default)
Dec 01 09:14:47 compute-0 systemd[1]: libpod-conmon-c7dfae658683f9d0178e5db16bc5e2efe41e380ef02b3f36d24a043899738501.scope: Deactivated successfully.
Dec 01 09:14:47 compute-0 sudo[94595]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:47 compute-0 sudo[94758]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:47 compute-0 sudo[94758]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:47 compute-0 sudo[94758]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:47 compute-0 sudo[94783]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:14:47 compute-0 sudo[94783]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:47 compute-0 sudo[94783]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:47 compute-0 sudo[94808]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:47 compute-0 sudo[94808]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:47 compute-0 sudo[94808]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:47 compute-0 sudo[94833]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- lvm list --format json
Dec 01 09:14:47 compute-0 sudo[94833]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:47 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e20 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:14:48 compute-0 ceph-mon[75031]: pgmap v58: 1 pgs: 1 active+clean; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:14:48 compute-0 podman[94898]: 2025-12-01 09:14:48.277256506 +0000 UTC m=+0.055764011 container create b105126fc98498345ed851fe871c967591ef0e0372db56d93498a1a6c7f3ab32 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=unruffled_lichterman, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True)
Dec 01 09:14:48 compute-0 systemd[1]: Started libpod-conmon-b105126fc98498345ed851fe871c967591ef0e0372db56d93498a1a6c7f3ab32.scope.
Dec 01 09:14:48 compute-0 podman[94898]: 2025-12-01 09:14:48.254551504 +0000 UTC m=+0.033059059 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:14:48 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:48 compute-0 podman[94898]: 2025-12-01 09:14:48.36891749 +0000 UTC m=+0.147425035 container init b105126fc98498345ed851fe871c967591ef0e0372db56d93498a1a6c7f3ab32 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=unruffled_lichterman, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:14:48 compute-0 podman[94898]: 2025-12-01 09:14:48.37746418 +0000 UTC m=+0.155971685 container start b105126fc98498345ed851fe871c967591ef0e0372db56d93498a1a6c7f3ab32 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=unruffled_lichterman, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:14:48 compute-0 podman[94898]: 2025-12-01 09:14:48.381617977 +0000 UTC m=+0.160125482 container attach b105126fc98498345ed851fe871c967591ef0e0372db56d93498a1a6c7f3ab32 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=unruffled_lichterman, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:14:48 compute-0 unruffled_lichterman[94914]: 167 167
Dec 01 09:14:48 compute-0 systemd[1]: libpod-b105126fc98498345ed851fe871c967591ef0e0372db56d93498a1a6c7f3ab32.scope: Deactivated successfully.
Dec 01 09:14:48 compute-0 conmon[94914]: conmon b105126fc98498345ed8 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-b105126fc98498345ed851fe871c967591ef0e0372db56d93498a1a6c7f3ab32.scope/container/memory.events
Dec 01 09:14:48 compute-0 podman[94898]: 2025-12-01 09:14:48.388135656 +0000 UTC m=+0.166643171 container died b105126fc98498345ed851fe871c967591ef0e0372db56d93498a1a6c7f3ab32 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=unruffled_lichterman, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:14:48 compute-0 systemd[1]: var-lib-containers-storage-overlay-d0d9fe6c1b686670527a99430fdefb67465b531d692cdf50f4dcac28916066e3-merged.mount: Deactivated successfully.
Dec 01 09:14:48 compute-0 podman[94898]: 2025-12-01 09:14:48.425262637 +0000 UTC m=+0.203770142 container remove b105126fc98498345ed851fe871c967591ef0e0372db56d93498a1a6c7f3ab32 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=unruffled_lichterman, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2)
Dec 01 09:14:48 compute-0 systemd[1]: libpod-conmon-b105126fc98498345ed851fe871c967591ef0e0372db56d93498a1a6c7f3ab32.scope: Deactivated successfully.
Dec 01 09:14:48 compute-0 podman[94938]: 2025-12-01 09:14:48.583868712 +0000 UTC m=+0.037430902 container create 5fb92ffa0cc868fe4466fb760e101114596ef286008303c4c98d55028bb656f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_hermann, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:14:48 compute-0 systemd[1]: Started libpod-conmon-5fb92ffa0cc868fe4466fb760e101114596ef286008303c4c98d55028bb656f1.scope.
Dec 01 09:14:48 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:48 compute-0 podman[94938]: 2025-12-01 09:14:48.568689989 +0000 UTC m=+0.022252199 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:14:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d11d224c2f29d195088492fa8fb5ca6c475aad7c5bb1b2263a519bcaf46b3a98/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d11d224c2f29d195088492fa8fb5ca6c475aad7c5bb1b2263a519bcaf46b3a98/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d11d224c2f29d195088492fa8fb5ca6c475aad7c5bb1b2263a519bcaf46b3a98/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d11d224c2f29d195088492fa8fb5ca6c475aad7c5bb1b2263a519bcaf46b3a98/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:48 compute-0 podman[94938]: 2025-12-01 09:14:48.675618199 +0000 UTC m=+0.129180399 container init 5fb92ffa0cc868fe4466fb760e101114596ef286008303c4c98d55028bb656f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_hermann, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, io.buildah.version=1.39.3)
Dec 01 09:14:48 compute-0 podman[94938]: 2025-12-01 09:14:48.688209052 +0000 UTC m=+0.141771252 container start 5fb92ffa0cc868fe4466fb760e101114596ef286008303c4c98d55028bb656f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_hermann, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Dec 01 09:14:48 compute-0 podman[94938]: 2025-12-01 09:14:48.692080051 +0000 UTC m=+0.145642241 container attach 5fb92ffa0cc868fe4466fb760e101114596ef286008303c4c98d55028bb656f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_hermann, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:14:49 compute-0 sudo[94983]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdzfxrtqzqdptgilmittskbaekovvujp ; /usr/bin/python3'
Dec 01 09:14:49 compute-0 sudo[94983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:14:49 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v59: 1 pgs: 1 active+clean; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:14:49 compute-0 python3[94985]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z   --volume /home/ceph-admin/specs/ceph_spec.yaml:/home/ceph_spec.yaml:z   --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   status --format json | jq .osdmap.num_up_osds _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:14:49 compute-0 podman[94987]: 2025-12-01 09:14:49.267957265 +0000 UTC m=+0.060678571 container create 859d5b71cd9b9d0ca5fe43d6d46af4c9d9830b39f25fb23cdd2ade6fa2b4854c (image=quay.io/ceph/ceph:v18, name=hardcore_elbakyan, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:14:49 compute-0 systemd[1]: Started libpod-conmon-859d5b71cd9b9d0ca5fe43d6d46af4c9d9830b39f25fb23cdd2ade6fa2b4854c.scope.
Dec 01 09:14:49 compute-0 podman[94987]: 2025-12-01 09:14:49.238580419 +0000 UTC m=+0.031301785 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:14:49 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f06b18828f78f6cfa8695665f31ff023fed42c080f6fb35970bc8cb2afa30d2/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f06b18828f78f6cfa8695665f31ff023fed42c080f6fb35970bc8cb2afa30d2/merged/home/ceph_spec.yaml supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f06b18828f78f6cfa8695665f31ff023fed42c080f6fb35970bc8cb2afa30d2/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:49 compute-0 podman[94987]: 2025-12-01 09:14:49.36163502 +0000 UTC m=+0.154356326 container init 859d5b71cd9b9d0ca5fe43d6d46af4c9d9830b39f25fb23cdd2ade6fa2b4854c (image=quay.io/ceph/ceph:v18, name=hardcore_elbakyan, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2)
Dec 01 09:14:49 compute-0 podman[94987]: 2025-12-01 09:14:49.36850914 +0000 UTC m=+0.161230446 container start 859d5b71cd9b9d0ca5fe43d6d46af4c9d9830b39f25fb23cdd2ade6fa2b4854c (image=quay.io/ceph/ceph:v18, name=hardcore_elbakyan, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:14:49 compute-0 podman[94987]: 2025-12-01 09:14:49.373060438 +0000 UTC m=+0.165781764 container attach 859d5b71cd9b9d0ca5fe43d6d46af4c9d9830b39f25fb23cdd2ade6fa2b4854c (image=quay.io/ceph/ceph:v18, name=hardcore_elbakyan, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, ceph=True)
Dec 01 09:14:49 compute-0 great_hermann[94955]: {
Dec 01 09:14:49 compute-0 great_hermann[94955]:     "0": [
Dec 01 09:14:49 compute-0 great_hermann[94955]:         {
Dec 01 09:14:49 compute-0 great_hermann[94955]:             "devices": [
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "/dev/loop3"
Dec 01 09:14:49 compute-0 great_hermann[94955]:             ],
Dec 01 09:14:49 compute-0 great_hermann[94955]:             "lv_name": "ceph_lv0",
Dec 01 09:14:49 compute-0 great_hermann[94955]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:14:49 compute-0 great_hermann[94955]:             "lv_size": "21470642176",
Dec 01 09:14:49 compute-0 great_hermann[94955]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:14:49 compute-0 great_hermann[94955]:             "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:14:49 compute-0 great_hermann[94955]:             "name": "ceph_lv0",
Dec 01 09:14:49 compute-0 great_hermann[94955]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:14:49 compute-0 great_hermann[94955]:             "tags": {
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.cluster_name": "ceph",
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.crush_device_class": "",
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.encrypted": "0",
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.osd_id": "0",
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.type": "block",
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.vdo": "0"
Dec 01 09:14:49 compute-0 great_hermann[94955]:             },
Dec 01 09:14:49 compute-0 great_hermann[94955]:             "type": "block",
Dec 01 09:14:49 compute-0 great_hermann[94955]:             "vg_name": "ceph_vg0"
Dec 01 09:14:49 compute-0 great_hermann[94955]:         }
Dec 01 09:14:49 compute-0 great_hermann[94955]:     ],
Dec 01 09:14:49 compute-0 great_hermann[94955]:     "1": [
Dec 01 09:14:49 compute-0 great_hermann[94955]:         {
Dec 01 09:14:49 compute-0 great_hermann[94955]:             "devices": [
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "/dev/loop4"
Dec 01 09:14:49 compute-0 great_hermann[94955]:             ],
Dec 01 09:14:49 compute-0 great_hermann[94955]:             "lv_name": "ceph_lv1",
Dec 01 09:14:49 compute-0 great_hermann[94955]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:14:49 compute-0 great_hermann[94955]:             "lv_size": "21470642176",
Dec 01 09:14:49 compute-0 great_hermann[94955]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:14:49 compute-0 great_hermann[94955]:             "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:14:49 compute-0 great_hermann[94955]:             "name": "ceph_lv1",
Dec 01 09:14:49 compute-0 great_hermann[94955]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:14:49 compute-0 great_hermann[94955]:             "tags": {
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.cluster_name": "ceph",
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.crush_device_class": "",
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.encrypted": "0",
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.osd_id": "1",
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.type": "block",
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.vdo": "0"
Dec 01 09:14:49 compute-0 great_hermann[94955]:             },
Dec 01 09:14:49 compute-0 great_hermann[94955]:             "type": "block",
Dec 01 09:14:49 compute-0 great_hermann[94955]:             "vg_name": "ceph_vg1"
Dec 01 09:14:49 compute-0 great_hermann[94955]:         }
Dec 01 09:14:49 compute-0 great_hermann[94955]:     ],
Dec 01 09:14:49 compute-0 great_hermann[94955]:     "2": [
Dec 01 09:14:49 compute-0 great_hermann[94955]:         {
Dec 01 09:14:49 compute-0 great_hermann[94955]:             "devices": [
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "/dev/loop5"
Dec 01 09:14:49 compute-0 great_hermann[94955]:             ],
Dec 01 09:14:49 compute-0 great_hermann[94955]:             "lv_name": "ceph_lv2",
Dec 01 09:14:49 compute-0 great_hermann[94955]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:14:49 compute-0 great_hermann[94955]:             "lv_size": "21470642176",
Dec 01 09:14:49 compute-0 great_hermann[94955]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:14:49 compute-0 great_hermann[94955]:             "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:14:49 compute-0 great_hermann[94955]:             "name": "ceph_lv2",
Dec 01 09:14:49 compute-0 great_hermann[94955]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:14:49 compute-0 great_hermann[94955]:             "tags": {
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.cluster_name": "ceph",
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.crush_device_class": "",
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.encrypted": "0",
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.osd_id": "2",
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.type": "block",
Dec 01 09:14:49 compute-0 great_hermann[94955]:                 "ceph.vdo": "0"
Dec 01 09:14:49 compute-0 great_hermann[94955]:             },
Dec 01 09:14:49 compute-0 great_hermann[94955]:             "type": "block",
Dec 01 09:14:49 compute-0 great_hermann[94955]:             "vg_name": "ceph_vg2"
Dec 01 09:14:49 compute-0 great_hermann[94955]:         }
Dec 01 09:14:49 compute-0 great_hermann[94955]:     ]
Dec 01 09:14:49 compute-0 great_hermann[94955]: }
Dec 01 09:14:49 compute-0 systemd[1]: libpod-5fb92ffa0cc868fe4466fb760e101114596ef286008303c4c98d55028bb656f1.scope: Deactivated successfully.
Dec 01 09:14:49 compute-0 podman[94938]: 2025-12-01 09:14:49.538866692 +0000 UTC m=+0.992428902 container died 5fb92ffa0cc868fe4466fb760e101114596ef286008303c4c98d55028bb656f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_hermann, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Dec 01 09:14:49 compute-0 systemd[1]: var-lib-containers-storage-overlay-d11d224c2f29d195088492fa8fb5ca6c475aad7c5bb1b2263a519bcaf46b3a98-merged.mount: Deactivated successfully.
Dec 01 09:14:49 compute-0 podman[94938]: 2025-12-01 09:14:49.591258349 +0000 UTC m=+1.044820539 container remove 5fb92ffa0cc868fe4466fb760e101114596ef286008303c4c98d55028bb656f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_hermann, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:14:49 compute-0 systemd[1]: libpod-conmon-5fb92ffa0cc868fe4466fb760e101114596ef286008303c4c98d55028bb656f1.scope: Deactivated successfully.
Dec 01 09:14:49 compute-0 sudo[94833]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:49 compute-0 sudo[95025]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:49 compute-0 sudo[95025]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:49 compute-0 sudo[95025]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:49 compute-0 sudo[95060]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:14:49 compute-0 sudo[95060]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:49 compute-0 sudo[95060]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:49 compute-0 sudo[95094]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:49 compute-0 sudo[95094]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:49 compute-0 sudo[95094]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:49 compute-0 sudo[95119]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- raw list --format json
Dec 01 09:14:49 compute-0 sudo[95119]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:49 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json"} v 0) v1
Dec 01 09:14:49 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1076553885' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Dec 01 09:14:49 compute-0 hardcore_elbakyan[95003]: 
Dec 01 09:14:49 compute-0 hardcore_elbakyan[95003]: {"fsid":"5620a9fb-e540-5250-a0e8-7aaad5347e3b","health":{"status":"HEALTH_OK","checks":{},"mutes":[]},"election_epoch":5,"quorum":[0],"quorum_names":["compute-0"],"quorum_age":147,"monmap":{"epoch":1,"min_mon_release_name":"reef","num_mons":1},"osdmap":{"epoch":20,"num_osds":3,"num_up_osds":3,"osd_up_since":1764580474,"num_in_osds":3,"osd_in_since":1764580437,"num_remapped_pgs":0},"pgmap":{"pgs_by_state":[{"state_name":"active+clean","count":1}],"num_pgs":1,"num_pools":1,"num_objects":2,"data_bytes":459280,"bytes_used":502837248,"bytes_avail":63909089280,"bytes_total":64411926528},"fsmap":{"epoch":1,"by_rank":[],"up:standby":0},"mgrmap":{"available":true,"num_standbys":0,"modules":["cephadm","iostat","nfs","restful"],"services":{}},"servicemap":{"epoch":2,"modified":"2025-12-01T09:14:14.959091+0000","services":{}},"progress_events":{}}
Dec 01 09:14:50 compute-0 systemd[1]: libpod-859d5b71cd9b9d0ca5fe43d6d46af4c9d9830b39f25fb23cdd2ade6fa2b4854c.scope: Deactivated successfully.
Dec 01 09:14:50 compute-0 podman[94987]: 2025-12-01 09:14:50.014262383 +0000 UTC m=+0.806983699 container died 859d5b71cd9b9d0ca5fe43d6d46af4c9d9830b39f25fb23cdd2ade6fa2b4854c (image=quay.io/ceph/ceph:v18, name=hardcore_elbakyan, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:14:50 compute-0 systemd[1]: var-lib-containers-storage-overlay-1f06b18828f78f6cfa8695665f31ff023fed42c080f6fb35970bc8cb2afa30d2-merged.mount: Deactivated successfully.
Dec 01 09:14:50 compute-0 podman[94987]: 2025-12-01 09:14:50.072371674 +0000 UTC m=+0.865093010 container remove 859d5b71cd9b9d0ca5fe43d6d46af4c9d9830b39f25fb23cdd2ade6fa2b4854c (image=quay.io/ceph/ceph:v18, name=hardcore_elbakyan, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:14:50 compute-0 systemd[1]: libpod-conmon-859d5b71cd9b9d0ca5fe43d6d46af4c9d9830b39f25fb23cdd2ade6fa2b4854c.scope: Deactivated successfully.
Dec 01 09:14:50 compute-0 sudo[94983]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:50 compute-0 ceph-mon[75031]: pgmap v59: 1 pgs: 1 active+clean; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:14:50 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1076553885' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Dec 01 09:14:50 compute-0 podman[95201]: 2025-12-01 09:14:50.197341713 +0000 UTC m=+0.037007169 container create 262f2bb39b111e50b8bd4d310721c9eddcc92c9824e427a9cb23076033d91968 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_khorana, org.label-schema.build-date=20250507, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:14:50 compute-0 systemd[1]: Started libpod-conmon-262f2bb39b111e50b8bd4d310721c9eddcc92c9824e427a9cb23076033d91968.scope.
Dec 01 09:14:50 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:50 compute-0 podman[95201]: 2025-12-01 09:14:50.267673177 +0000 UTC m=+0.107338653 container init 262f2bb39b111e50b8bd4d310721c9eddcc92c9824e427a9cb23076033d91968 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_khorana, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Dec 01 09:14:50 compute-0 podman[95201]: 2025-12-01 09:14:50.274215247 +0000 UTC m=+0.113880713 container start 262f2bb39b111e50b8bd4d310721c9eddcc92c9824e427a9cb23076033d91968 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_khorana, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:14:50 compute-0 podman[95201]: 2025-12-01 09:14:50.181516711 +0000 UTC m=+0.021182187 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:14:50 compute-0 podman[95201]: 2025-12-01 09:14:50.277469326 +0000 UTC m=+0.117134802 container attach 262f2bb39b111e50b8bd4d310721c9eddcc92c9824e427a9cb23076033d91968 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_khorana, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_REF=reef, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 01 09:14:50 compute-0 goofy_khorana[95217]: 167 167
Dec 01 09:14:50 compute-0 systemd[1]: libpod-262f2bb39b111e50b8bd4d310721c9eddcc92c9824e427a9cb23076033d91968.scope: Deactivated successfully.
Dec 01 09:14:50 compute-0 podman[95201]: 2025-12-01 09:14:50.278771165 +0000 UTC m=+0.118436621 container died 262f2bb39b111e50b8bd4d310721c9eddcc92c9824e427a9cb23076033d91968 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_khorana, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:14:50 compute-0 systemd[1]: var-lib-containers-storage-overlay-063b5f59300888895678bf4cc9ecb77d88e38787a81fdfd3cc791f080adbb036-merged.mount: Deactivated successfully.
Dec 01 09:14:50 compute-0 podman[95201]: 2025-12-01 09:14:50.312518904 +0000 UTC m=+0.152184360 container remove 262f2bb39b111e50b8bd4d310721c9eddcc92c9824e427a9cb23076033d91968 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_khorana, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507)
Dec 01 09:14:50 compute-0 systemd[1]: libpod-conmon-262f2bb39b111e50b8bd4d310721c9eddcc92c9824e427a9cb23076033d91968.scope: Deactivated successfully.
Dec 01 09:14:50 compute-0 sudo[95257]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxwnylgmturgkohcxucphrbatuhzstil ; /usr/bin/python3'
Dec 01 09:14:50 compute-0 sudo[95257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:14:50 compute-0 podman[95265]: 2025-12-01 09:14:50.47148664 +0000 UTC m=+0.043214318 container create 87bbf4bf21e908a92d6fcbc74d07a177ea44e981a7a2c981f1a9e577e5f8a1d5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=amazing_lamport, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2)
Dec 01 09:14:50 compute-0 systemd[1]: Started libpod-conmon-87bbf4bf21e908a92d6fcbc74d07a177ea44e981a7a2c981f1a9e577e5f8a1d5.scope.
Dec 01 09:14:50 compute-0 python3[95259]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool create vms  replicated_rule --autoscale-mode on _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:14:50 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:50 compute-0 podman[95265]: 2025-12-01 09:14:50.453264184 +0000 UTC m=+0.024991882 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:14:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/493438c4c795cc66904661c5ff6827a8fa365df31329a35dc3777a38425c2dd4/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/493438c4c795cc66904661c5ff6827a8fa365df31329a35dc3777a38425c2dd4/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/493438c4c795cc66904661c5ff6827a8fa365df31329a35dc3777a38425c2dd4/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/493438c4c795cc66904661c5ff6827a8fa365df31329a35dc3777a38425c2dd4/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:50 compute-0 podman[95265]: 2025-12-01 09:14:50.567397883 +0000 UTC m=+0.139125581 container init 87bbf4bf21e908a92d6fcbc74d07a177ea44e981a7a2c981f1a9e577e5f8a1d5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=amazing_lamport, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:14:50 compute-0 podman[95265]: 2025-12-01 09:14:50.57450736 +0000 UTC m=+0.146235038 container start 87bbf4bf21e908a92d6fcbc74d07a177ea44e981a7a2c981f1a9e577e5f8a1d5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=amazing_lamport, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, ceph=True, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:14:50 compute-0 podman[95265]: 2025-12-01 09:14:50.579871084 +0000 UTC m=+0.151598782 container attach 87bbf4bf21e908a92d6fcbc74d07a177ea44e981a7a2c981f1a9e577e5f8a1d5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=amazing_lamport, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:14:50 compute-0 podman[95284]: 2025-12-01 09:14:50.588010752 +0000 UTC m=+0.043039133 container create fe2df0efc438f303d7f9448deb8ecf271e5a6008a7a2e5591d3114e50bc57e68 (image=quay.io/ceph/ceph:v18, name=sleepy_gates, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:14:50 compute-0 systemd[1]: Started libpod-conmon-fe2df0efc438f303d7f9448deb8ecf271e5a6008a7a2e5591d3114e50bc57e68.scope.
Dec 01 09:14:50 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:50 compute-0 podman[95284]: 2025-12-01 09:14:50.569199508 +0000 UTC m=+0.024227899 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:14:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0e066a1ec59fc60d27d3ea1360a65001911a92d455ed84fba42bebfd09656328/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0e066a1ec59fc60d27d3ea1360a65001911a92d455ed84fba42bebfd09656328/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:50 compute-0 podman[95284]: 2025-12-01 09:14:50.67718814 +0000 UTC m=+0.132216541 container init fe2df0efc438f303d7f9448deb8ecf271e5a6008a7a2e5591d3114e50bc57e68 (image=quay.io/ceph/ceph:v18, name=sleepy_gates, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 01 09:14:50 compute-0 podman[95284]: 2025-12-01 09:14:50.685457962 +0000 UTC m=+0.140486343 container start fe2df0efc438f303d7f9448deb8ecf271e5a6008a7a2e5591d3114e50bc57e68 (image=quay.io/ceph/ceph:v18, name=sleepy_gates, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.39.3)
Dec 01 09:14:50 compute-0 podman[95284]: 2025-12-01 09:14:50.689209847 +0000 UTC m=+0.144238258 container attach fe2df0efc438f303d7f9448deb8ecf271e5a6008a7a2e5591d3114e50bc57e68 (image=quay.io/ceph/ceph:v18, name=sleepy_gates, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507)
Dec 01 09:14:51 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v60: 1 pgs: 1 active+clean; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:14:51 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"} v 0) v1
Dec 01 09:14:51 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/4139565444' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Dec 01 09:14:51 compute-0 amazing_lamport[95282]: {
Dec 01 09:14:51 compute-0 amazing_lamport[95282]:     "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec 01 09:14:51 compute-0 amazing_lamport[95282]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:14:51 compute-0 amazing_lamport[95282]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 01 09:14:51 compute-0 amazing_lamport[95282]:         "osd_id": 0,
Dec 01 09:14:51 compute-0 amazing_lamport[95282]:         "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:14:51 compute-0 amazing_lamport[95282]:         "type": "bluestore"
Dec 01 09:14:51 compute-0 amazing_lamport[95282]:     },
Dec 01 09:14:51 compute-0 amazing_lamport[95282]:     "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec 01 09:14:51 compute-0 amazing_lamport[95282]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:14:51 compute-0 amazing_lamport[95282]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 01 09:14:51 compute-0 amazing_lamport[95282]:         "osd_id": 1,
Dec 01 09:14:51 compute-0 amazing_lamport[95282]:         "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:14:51 compute-0 amazing_lamport[95282]:         "type": "bluestore"
Dec 01 09:14:51 compute-0 amazing_lamport[95282]:     },
Dec 01 09:14:51 compute-0 amazing_lamport[95282]:     "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec 01 09:14:51 compute-0 amazing_lamport[95282]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:14:51 compute-0 amazing_lamport[95282]:         "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec 01 09:14:51 compute-0 amazing_lamport[95282]:         "osd_id": 2,
Dec 01 09:14:51 compute-0 amazing_lamport[95282]:         "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:14:51 compute-0 amazing_lamport[95282]:         "type": "bluestore"
Dec 01 09:14:51 compute-0 amazing_lamport[95282]:     }
Dec 01 09:14:51 compute-0 amazing_lamport[95282]: }
Dec 01 09:14:51 compute-0 systemd[1]: libpod-87bbf4bf21e908a92d6fcbc74d07a177ea44e981a7a2c981f1a9e577e5f8a1d5.scope: Deactivated successfully.
Dec 01 09:14:51 compute-0 systemd[1]: libpod-87bbf4bf21e908a92d6fcbc74d07a177ea44e981a7a2c981f1a9e577e5f8a1d5.scope: Consumed 1.307s CPU time.
Dec 01 09:14:51 compute-0 podman[95265]: 2025-12-01 09:14:51.890456823 +0000 UTC m=+1.462184511 container died 87bbf4bf21e908a92d6fcbc74d07a177ea44e981a7a2c981f1a9e577e5f8a1d5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=amazing_lamport, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_REF=reef, io.buildah.version=1.39.3, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:14:51 compute-0 systemd[1]: var-lib-containers-storage-overlay-493438c4c795cc66904661c5ff6827a8fa365df31329a35dc3777a38425c2dd4-merged.mount: Deactivated successfully.
Dec 01 09:14:51 compute-0 podman[95265]: 2025-12-01 09:14:51.93891241 +0000 UTC m=+1.510640088 container remove 87bbf4bf21e908a92d6fcbc74d07a177ea44e981a7a2c981f1a9e577e5f8a1d5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=amazing_lamport, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507)
Dec 01 09:14:51 compute-0 systemd[1]: libpod-conmon-87bbf4bf21e908a92d6fcbc74d07a177ea44e981a7a2c981f1a9e577e5f8a1d5.scope: Deactivated successfully.
Dec 01 09:14:51 compute-0 sudo[95119]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:51 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:14:51 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:51 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:14:51 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:52 compute-0 sudo[95370]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:14:52 compute-0 sudo[95370]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:52 compute-0 sudo[95370]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:52 compute-0 sudo[95395]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:14:52 compute-0 sudo[95395]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:14:52 compute-0 sudo[95395]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:52 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e20 do_prune osdmap full prune enabled
Dec 01 09:14:52 compute-0 ceph-mon[75031]: pgmap v60: 1 pgs: 1 active+clean; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:14:52 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/4139565444' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Dec 01 09:14:52 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:52 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:14:52 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/4139565444' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec 01 09:14:52 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e21 e21: 3 total, 3 up, 3 in
Dec 01 09:14:52 compute-0 sleepy_gates[95302]: pool 'vms' created
Dec 01 09:14:52 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e21: 3 total, 3 up, 3 in
Dec 01 09:14:52 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 21 pg[2.0( empty local-lis/les=0/0 n=0 ec=21/21 lis/c=0/0 les/c/f=0/0/0 sis=21) [2] r=0 lpr=21 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:14:52 compute-0 systemd[1]: libpod-fe2df0efc438f303d7f9448deb8ecf271e5a6008a7a2e5591d3114e50bc57e68.scope: Deactivated successfully.
Dec 01 09:14:52 compute-0 podman[95284]: 2025-12-01 09:14:52.195616005 +0000 UTC m=+1.650644396 container died fe2df0efc438f303d7f9448deb8ecf271e5a6008a7a2e5591d3114e50bc57e68 (image=quay.io/ceph/ceph:v18, name=sleepy_gates, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec 01 09:14:52 compute-0 systemd[1]: var-lib-containers-storage-overlay-0e066a1ec59fc60d27d3ea1360a65001911a92d455ed84fba42bebfd09656328-merged.mount: Deactivated successfully.
Dec 01 09:14:52 compute-0 podman[95284]: 2025-12-01 09:14:52.243491784 +0000 UTC m=+1.698520165 container remove fe2df0efc438f303d7f9448deb8ecf271e5a6008a7a2e5591d3114e50bc57e68 (image=quay.io/ceph/ceph:v18, name=sleepy_gates, io.buildah.version=1.39.3, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Dec 01 09:14:52 compute-0 systemd[1]: libpod-conmon-fe2df0efc438f303d7f9448deb8ecf271e5a6008a7a2e5591d3114e50bc57e68.scope: Deactivated successfully.
Dec 01 09:14:52 compute-0 sudo[95257]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:52 compute-0 sudo[95458]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahkmpnxwnhgwfpngsrlthsokswekdacl ; /usr/bin/python3'
Dec 01 09:14:52 compute-0 sudo[95458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:14:52 compute-0 python3[95460]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool create volumes  replicated_rule --autoscale-mode on _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:14:52 compute-0 podman[95461]: 2025-12-01 09:14:52.626347855 +0000 UTC m=+0.044887300 container create f3570af510c51015446019015b8d4e95a89b60bfbbfdef70440440f55a6ee0bb (image=quay.io/ceph/ceph:v18, name=epic_dewdney, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef)
Dec 01 09:14:52 compute-0 systemd[1]: Started libpod-conmon-f3570af510c51015446019015b8d4e95a89b60bfbbfdef70440440f55a6ee0bb.scope.
Dec 01 09:14:52 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/153f7faa2662baaae11662c450fe794fd8815f849dd49e3c9f1931ee97d069d7/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/153f7faa2662baaae11662c450fe794fd8815f849dd49e3c9f1931ee97d069d7/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:52 compute-0 podman[95461]: 2025-12-01 09:14:52.697143753 +0000 UTC m=+0.115683208 container init f3570af510c51015446019015b8d4e95a89b60bfbbfdef70440440f55a6ee0bb (image=quay.io/ceph/ceph:v18, name=epic_dewdney, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True)
Dec 01 09:14:52 compute-0 podman[95461]: 2025-12-01 09:14:52.607910063 +0000 UTC m=+0.026449518 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:14:52 compute-0 podman[95461]: 2025-12-01 09:14:52.707399405 +0000 UTC m=+0.125938840 container start f3570af510c51015446019015b8d4e95a89b60bfbbfdef70440440f55a6ee0bb (image=quay.io/ceph/ceph:v18, name=epic_dewdney, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec 01 09:14:52 compute-0 podman[95461]: 2025-12-01 09:14:52.710798569 +0000 UTC m=+0.129338014 container attach f3570af510c51015446019015b8d4e95a89b60bfbbfdef70440440f55a6ee0bb (image=quay.io/ceph/ceph:v18, name=epic_dewdney, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3)
Dec 01 09:14:52 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e21 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:14:53 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v62: 2 pgs: 1 active+clean, 1 unknown; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:14:53 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e21 do_prune osdmap full prune enabled
Dec 01 09:14:53 compute-0 ceph-mon[75031]: log_channel(cluster) log [WRN] : Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Dec 01 09:14:53 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e22 e22: 3 total, 3 up, 3 in
Dec 01 09:14:53 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e22: 3 total, 3 up, 3 in
Dec 01 09:14:53 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/4139565444' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec 01 09:14:53 compute-0 ceph-mon[75031]: osdmap e21: 3 total, 3 up, 3 in
Dec 01 09:14:53 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 22 pg[2.0( empty local-lis/les=21/22 n=0 ec=21/21 lis/c=0/0 les/c/f=0/0/0 sis=21) [2] r=0 lpr=21 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:14:53 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"} v 0) v1
Dec 01 09:14:53 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/60547159' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Dec 01 09:14:54 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e22 do_prune osdmap full prune enabled
Dec 01 09:14:54 compute-0 ceph-mon[75031]: pgmap v62: 2 pgs: 1 active+clean, 1 unknown; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:14:54 compute-0 ceph-mon[75031]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Dec 01 09:14:54 compute-0 ceph-mon[75031]: osdmap e22: 3 total, 3 up, 3 in
Dec 01 09:14:54 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/60547159' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Dec 01 09:14:54 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/60547159' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec 01 09:14:54 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e23 e23: 3 total, 3 up, 3 in
Dec 01 09:14:54 compute-0 epic_dewdney[95477]: pool 'volumes' created
Dec 01 09:14:54 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e23: 3 total, 3 up, 3 in
Dec 01 09:14:54 compute-0 systemd[1]: libpod-f3570af510c51015446019015b8d4e95a89b60bfbbfdef70440440f55a6ee0bb.scope: Deactivated successfully.
Dec 01 09:14:54 compute-0 podman[95461]: 2025-12-01 09:14:54.230995287 +0000 UTC m=+1.649534762 container died f3570af510c51015446019015b8d4e95a89b60bfbbfdef70440440f55a6ee0bb (image=quay.io/ceph/ceph:v18, name=epic_dewdney, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:14:54 compute-0 systemd[1]: var-lib-containers-storage-overlay-153f7faa2662baaae11662c450fe794fd8815f849dd49e3c9f1931ee97d069d7-merged.mount: Deactivated successfully.
Dec 01 09:14:54 compute-0 podman[95461]: 2025-12-01 09:14:54.281128035 +0000 UTC m=+1.699667470 container remove f3570af510c51015446019015b8d4e95a89b60bfbbfdef70440440f55a6ee0bb (image=quay.io/ceph/ceph:v18, name=epic_dewdney, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:14:54 compute-0 systemd[1]: libpod-conmon-f3570af510c51015446019015b8d4e95a89b60bfbbfdef70440440f55a6ee0bb.scope: Deactivated successfully.
Dec 01 09:14:54 compute-0 sudo[95458]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:54 compute-0 sudo[95536]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwjramskmyxaquuwuvmtknnyfikgondq ; /usr/bin/python3'
Dec 01 09:14:54 compute-0 sudo[95536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:14:54 compute-0 python3[95538]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool create backups  replicated_rule --autoscale-mode on _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:14:54 compute-0 podman[95539]: 2025-12-01 09:14:54.632992941 +0000 UTC m=+0.042183937 container create c120a378ce9151e27aac70f49da98cd53def5d57355f286b131d4912d23c255a (image=quay.io/ceph/ceph:v18, name=vigorous_noether, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3)
Dec 01 09:14:54 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 23 pg[3.0( empty local-lis/les=0/0 n=0 ec=23/23 lis/c=0/0 les/c/f=0/0/0 sis=23) [1] r=0 lpr=23 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:14:54 compute-0 systemd[1]: Started libpod-conmon-c120a378ce9151e27aac70f49da98cd53def5d57355f286b131d4912d23c255a.scope.
Dec 01 09:14:54 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa5d8d0787aa7ef8c40321ce0843b610a3337d9a2a44e6a532cf967ef0bc8355/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa5d8d0787aa7ef8c40321ce0843b610a3337d9a2a44e6a532cf967ef0bc8355/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:54 compute-0 podman[95539]: 2025-12-01 09:14:54.698883499 +0000 UTC m=+0.108074515 container init c120a378ce9151e27aac70f49da98cd53def5d57355f286b131d4912d23c255a (image=quay.io/ceph/ceph:v18, name=vigorous_noether, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:14:54 compute-0 podman[95539]: 2025-12-01 09:14:54.705354656 +0000 UTC m=+0.114545652 container start c120a378ce9151e27aac70f49da98cd53def5d57355f286b131d4912d23c255a (image=quay.io/ceph/ceph:v18, name=vigorous_noether, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:14:54 compute-0 podman[95539]: 2025-12-01 09:14:54.708609836 +0000 UTC m=+0.117800852 container attach c120a378ce9151e27aac70f49da98cd53def5d57355f286b131d4912d23c255a (image=quay.io/ceph/ceph:v18, name=vigorous_noether, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:14:54 compute-0 podman[95539]: 2025-12-01 09:14:54.613908049 +0000 UTC m=+0.023099075 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:14:55 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v65: 3 pgs: 1 active+clean, 2 unknown; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:14:55 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e23 do_prune osdmap full prune enabled
Dec 01 09:14:55 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/60547159' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec 01 09:14:55 compute-0 ceph-mon[75031]: osdmap e23: 3 total, 3 up, 3 in
Dec 01 09:14:55 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e24 e24: 3 total, 3 up, 3 in
Dec 01 09:14:55 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e24: 3 total, 3 up, 3 in
Dec 01 09:14:55 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 24 pg[3.0( empty local-lis/les=23/24 n=0 ec=23/23 lis/c=0/0 les/c/f=0/0/0 sis=23) [1] r=0 lpr=23 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:14:55 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"} v 0) v1
Dec 01 09:14:55 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3505839965' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Dec 01 09:14:56 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e24 do_prune osdmap full prune enabled
Dec 01 09:14:56 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3505839965' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec 01 09:14:56 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e25 e25: 3 total, 3 up, 3 in
Dec 01 09:14:56 compute-0 vigorous_noether[95555]: pool 'backups' created
Dec 01 09:14:56 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e25: 3 total, 3 up, 3 in
Dec 01 09:14:56 compute-0 ceph-mon[75031]: pgmap v65: 3 pgs: 1 active+clean, 2 unknown; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:14:56 compute-0 ceph-mon[75031]: osdmap e24: 3 total, 3 up, 3 in
Dec 01 09:14:56 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3505839965' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Dec 01 09:14:56 compute-0 systemd[1]: libpod-c120a378ce9151e27aac70f49da98cd53def5d57355f286b131d4912d23c255a.scope: Deactivated successfully.
Dec 01 09:14:56 compute-0 podman[95539]: 2025-12-01 09:14:56.240741679 +0000 UTC m=+1.649932675 container died c120a378ce9151e27aac70f49da98cd53def5d57355f286b131d4912d23c255a (image=quay.io/ceph/ceph:v18, name=vigorous_noether, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507)
Dec 01 09:14:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-aa5d8d0787aa7ef8c40321ce0843b610a3337d9a2a44e6a532cf967ef0bc8355-merged.mount: Deactivated successfully.
Dec 01 09:14:56 compute-0 podman[95539]: 2025-12-01 09:14:56.285696469 +0000 UTC m=+1.694887465 container remove c120a378ce9151e27aac70f49da98cd53def5d57355f286b131d4912d23c255a (image=quay.io/ceph/ceph:v18, name=vigorous_noether, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3)
Dec 01 09:14:56 compute-0 systemd[1]: libpod-conmon-c120a378ce9151e27aac70f49da98cd53def5d57355f286b131d4912d23c255a.scope: Deactivated successfully.
Dec 01 09:14:56 compute-0 sudo[95536]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:56 compute-0 sudo[95617]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sayhdhtztolhsetwvwhoacbndrfjhbmr ; /usr/bin/python3'
Dec 01 09:14:56 compute-0 sudo[95617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:14:56 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 25 pg[4.0( empty local-lis/les=0/0 n=0 ec=25/25 lis/c=0/0 les/c/f=0/0/0 sis=25) [0] r=0 lpr=25 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:14:56 compute-0 python3[95619]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool create images  replicated_rule --autoscale-mode on _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:14:56 compute-0 podman[95620]: 2025-12-01 09:14:56.631891542 +0000 UTC m=+0.050305215 container create 474ebf9db0f7ae11bd7851746cda9dd745be30df9db8bc1f082d5c5d2dc5c018 (image=quay.io/ceph/ceph:v18, name=jovial_faraday, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:14:56 compute-0 systemd[1]: Started libpod-conmon-474ebf9db0f7ae11bd7851746cda9dd745be30df9db8bc1f082d5c5d2dc5c018.scope.
Dec 01 09:14:56 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1416c27369da903e47d370554a4469bd50572f8bdcac8fc960db13d06d605bdb/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1416c27369da903e47d370554a4469bd50572f8bdcac8fc960db13d06d605bdb/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:56 compute-0 podman[95620]: 2025-12-01 09:14:56.604622271 +0000 UTC m=+0.023036034 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:14:56 compute-0 podman[95620]: 2025-12-01 09:14:56.708179526 +0000 UTC m=+0.126593209 container init 474ebf9db0f7ae11bd7851746cda9dd745be30df9db8bc1f082d5c5d2dc5c018 (image=quay.io/ceph/ceph:v18, name=jovial_faraday, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:14:56 compute-0 podman[95620]: 2025-12-01 09:14:56.71388829 +0000 UTC m=+0.132301963 container start 474ebf9db0f7ae11bd7851746cda9dd745be30df9db8bc1f082d5c5d2dc5c018 (image=quay.io/ceph/ceph:v18, name=jovial_faraday, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2)
Dec 01 09:14:56 compute-0 podman[95620]: 2025-12-01 09:14:56.717188601 +0000 UTC m=+0.135602484 container attach 474ebf9db0f7ae11bd7851746cda9dd745be30df9db8bc1f082d5c5d2dc5c018 (image=quay.io/ceph/ceph:v18, name=jovial_faraday, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:14:57 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v68: 4 pgs: 2 active+clean, 2 unknown; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:14:57 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e25 do_prune osdmap full prune enabled
Dec 01 09:14:57 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e26 e26: 3 total, 3 up, 3 in
Dec 01 09:14:57 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3505839965' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec 01 09:14:57 compute-0 ceph-mon[75031]: osdmap e25: 3 total, 3 up, 3 in
Dec 01 09:14:57 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e26: 3 total, 3 up, 3 in
Dec 01 09:14:57 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 26 pg[4.0( empty local-lis/les=25/26 n=0 ec=25/25 lis/c=0/0 les/c/f=0/0/0 sis=25) [0] r=0 lpr=25 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:14:57 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"} v 0) v1
Dec 01 09:14:57 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/616124501' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Dec 01 09:14:57 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e26 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:14:58 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e26 do_prune osdmap full prune enabled
Dec 01 09:14:58 compute-0 ceph-mon[75031]: pgmap v68: 4 pgs: 2 active+clean, 2 unknown; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:14:58 compute-0 ceph-mon[75031]: osdmap e26: 3 total, 3 up, 3 in
Dec 01 09:14:58 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/616124501' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Dec 01 09:14:58 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/616124501' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec 01 09:14:58 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e27 e27: 3 total, 3 up, 3 in
Dec 01 09:14:58 compute-0 jovial_faraday[95635]: pool 'images' created
Dec 01 09:14:58 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e27: 3 total, 3 up, 3 in
Dec 01 09:14:58 compute-0 systemd[1]: libpod-474ebf9db0f7ae11bd7851746cda9dd745be30df9db8bc1f082d5c5d2dc5c018.scope: Deactivated successfully.
Dec 01 09:14:58 compute-0 podman[95620]: 2025-12-01 09:14:58.279782442 +0000 UTC m=+1.698196115 container died 474ebf9db0f7ae11bd7851746cda9dd745be30df9db8bc1f082d5c5d2dc5c018 (image=quay.io/ceph/ceph:v18, name=jovial_faraday, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2)
Dec 01 09:14:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-1416c27369da903e47d370554a4469bd50572f8bdcac8fc960db13d06d605bdb-merged.mount: Deactivated successfully.
Dec 01 09:14:58 compute-0 podman[95620]: 2025-12-01 09:14:58.326393093 +0000 UTC m=+1.744806786 container remove 474ebf9db0f7ae11bd7851746cda9dd745be30df9db8bc1f082d5c5d2dc5c018 (image=quay.io/ceph/ceph:v18, name=jovial_faraday, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:14:58 compute-0 systemd[1]: libpod-conmon-474ebf9db0f7ae11bd7851746cda9dd745be30df9db8bc1f082d5c5d2dc5c018.scope: Deactivated successfully.
Dec 01 09:14:58 compute-0 sudo[95617]: pam_unix(sudo:session): session closed for user root
Dec 01 09:14:58 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 27 pg[5.0( empty local-lis/les=0/0 n=0 ec=27/27 lis/c=0/0 les/c/f=0/0/0 sis=27) [2] r=0 lpr=27 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:14:58 compute-0 sudo[95698]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmxrnxxjcebjnrhgyktzhcanzqmydahi ; /usr/bin/python3'
Dec 01 09:14:58 compute-0 sudo[95698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:14:58 compute-0 python3[95700]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool create cephfs.cephfs.meta  replicated_rule --autoscale-mode on _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:14:58 compute-0 podman[95701]: 2025-12-01 09:14:58.677193646 +0000 UTC m=+0.051344686 container create b942687a484a37bf40882c3519170f741be319b9f06b20d11bbf2c0f85a7c2e8 (image=quay.io/ceph/ceph:v18, name=ecstatic_jepsen, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 01 09:14:58 compute-0 systemd[1]: Started libpod-conmon-b942687a484a37bf40882c3519170f741be319b9f06b20d11bbf2c0f85a7c2e8.scope.
Dec 01 09:14:58 compute-0 podman[95701]: 2025-12-01 09:14:58.656797844 +0000 UTC m=+0.030948904 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:14:58 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:14:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/85fc54dc345fd5ed408303be7b4539741a67940f7872e9f53e03cc1153b2ea16/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/85fc54dc345fd5ed408303be7b4539741a67940f7872e9f53e03cc1153b2ea16/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:14:58 compute-0 podman[95701]: 2025-12-01 09:14:58.773217803 +0000 UTC m=+0.147368873 container init b942687a484a37bf40882c3519170f741be319b9f06b20d11bbf2c0f85a7c2e8 (image=quay.io/ceph/ceph:v18, name=ecstatic_jepsen, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:14:58 compute-0 podman[95701]: 2025-12-01 09:14:58.778495594 +0000 UTC m=+0.152646634 container start b942687a484a37bf40882c3519170f741be319b9f06b20d11bbf2c0f85a7c2e8 (image=quay.io/ceph/ceph:v18, name=ecstatic_jepsen, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:14:58 compute-0 podman[95701]: 2025-12-01 09:14:58.782396283 +0000 UTC m=+0.156547353 container attach b942687a484a37bf40882c3519170f741be319b9f06b20d11bbf2c0f85a7c2e8 (image=quay.io/ceph/ceph:v18, name=ecstatic_jepsen, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:14:59 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v71: 5 pgs: 3 active+clean, 2 unknown; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:14:59 compute-0 ceph-mon[75031]: log_channel(cluster) log [WRN] : Health check update: 4 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Dec 01 09:14:59 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e27 do_prune osdmap full prune enabled
Dec 01 09:14:59 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e28 e28: 3 total, 3 up, 3 in
Dec 01 09:14:59 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/616124501' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec 01 09:14:59 compute-0 ceph-mon[75031]: osdmap e27: 3 total, 3 up, 3 in
Dec 01 09:14:59 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e28: 3 total, 3 up, 3 in
Dec 01 09:14:59 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 28 pg[5.0( empty local-lis/les=27/28 n=0 ec=27/27 lis/c=0/0 les/c/f=0/0/0 sis=27) [2] r=0 lpr=27 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:14:59 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"} v 0) v1
Dec 01 09:14:59 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/2536440165' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Dec 01 09:15:00 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e28 do_prune osdmap full prune enabled
Dec 01 09:15:00 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/2536440165' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec 01 09:15:00 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e29 e29: 3 total, 3 up, 3 in
Dec 01 09:15:00 compute-0 ecstatic_jepsen[95716]: pool 'cephfs.cephfs.meta' created
Dec 01 09:15:00 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e29: 3 total, 3 up, 3 in
Dec 01 09:15:00 compute-0 ceph-mon[75031]: pgmap v71: 5 pgs: 3 active+clean, 2 unknown; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:00 compute-0 ceph-mon[75031]: Health check update: 4 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Dec 01 09:15:00 compute-0 ceph-mon[75031]: osdmap e28: 3 total, 3 up, 3 in
Dec 01 09:15:00 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2536440165' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Dec 01 09:15:00 compute-0 systemd[1]: libpod-b942687a484a37bf40882c3519170f741be319b9f06b20d11bbf2c0f85a7c2e8.scope: Deactivated successfully.
Dec 01 09:15:00 compute-0 podman[95701]: 2025-12-01 09:15:00.30059221 +0000 UTC m=+1.674743250 container died b942687a484a37bf40882c3519170f741be319b9f06b20d11bbf2c0f85a7c2e8 (image=quay.io/ceph/ceph:v18, name=ecstatic_jepsen, ceph=True, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 01 09:15:00 compute-0 systemd[1]: var-lib-containers-storage-overlay-85fc54dc345fd5ed408303be7b4539741a67940f7872e9f53e03cc1153b2ea16-merged.mount: Deactivated successfully.
Dec 01 09:15:00 compute-0 podman[95701]: 2025-12-01 09:15:00.38293246 +0000 UTC m=+1.757083500 container remove b942687a484a37bf40882c3519170f741be319b9f06b20d11bbf2c0f85a7c2e8 (image=quay.io/ceph/ceph:v18, name=ecstatic_jepsen, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:15:00 compute-0 systemd[1]: libpod-conmon-b942687a484a37bf40882c3519170f741be319b9f06b20d11bbf2c0f85a7c2e8.scope: Deactivated successfully.
Dec 01 09:15:00 compute-0 sudo[95698]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:00 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 29 pg[6.0( empty local-lis/les=0/0 n=0 ec=29/29 lis/c=0/0 les/c/f=0/0/0 sis=29) [0] r=0 lpr=29 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:00 compute-0 sudo[95780]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxbnnitqmjwxgpettajtkaxgjjvwyefg ; /usr/bin/python3'
Dec 01 09:15:00 compute-0 sudo[95780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:15:00 compute-0 python3[95782]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool create cephfs.cephfs.data  replicated_rule --autoscale-mode on _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:15:00 compute-0 podman[95783]: 2025-12-01 09:15:00.767572454 +0000 UTC m=+0.021754904 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:15:00 compute-0 podman[95783]: 2025-12-01 09:15:00.876594838 +0000 UTC m=+0.130777308 container create e320005a9af5df5858ea1f45fe752eb0eaf842a55592d4f3848ca476c4b8b740 (image=quay.io/ceph/ceph:v18, name=elated_sanderson, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:15:00 compute-0 systemd[1]: Started libpod-conmon-e320005a9af5df5858ea1f45fe752eb0eaf842a55592d4f3848ca476c4b8b740.scope.
Dec 01 09:15:00 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0a8ec137484ab4f3843ac06a7f950734647c9e87c6b3a3b1477e6e8ca744698/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0a8ec137484ab4f3843ac06a7f950734647c9e87c6b3a3b1477e6e8ca744698/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:00 compute-0 podman[95783]: 2025-12-01 09:15:00.956505323 +0000 UTC m=+0.210687773 container init e320005a9af5df5858ea1f45fe752eb0eaf842a55592d4f3848ca476c4b8b740 (image=quay.io/ceph/ceph:v18, name=elated_sanderson, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec 01 09:15:00 compute-0 podman[95783]: 2025-12-01 09:15:00.962405513 +0000 UTC m=+0.216587943 container start e320005a9af5df5858ea1f45fe752eb0eaf842a55592d4f3848ca476c4b8b740 (image=quay.io/ceph/ceph:v18, name=elated_sanderson, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 01 09:15:00 compute-0 podman[95783]: 2025-12-01 09:15:00.965924971 +0000 UTC m=+0.220107421 container attach e320005a9af5df5858ea1f45fe752eb0eaf842a55592d4f3848ca476c4b8b740 (image=quay.io/ceph/ceph:v18, name=elated_sanderson, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2)
Dec 01 09:15:01 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v74: 6 pgs: 1 unknown, 5 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:01 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e29 do_prune osdmap full prune enabled
Dec 01 09:15:01 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2536440165' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec 01 09:15:01 compute-0 ceph-mon[75031]: osdmap e29: 3 total, 3 up, 3 in
Dec 01 09:15:01 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e30 e30: 3 total, 3 up, 3 in
Dec 01 09:15:01 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e30: 3 total, 3 up, 3 in
Dec 01 09:15:01 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 30 pg[6.0( empty local-lis/les=29/30 n=0 ec=29/29 lis/c=0/0 les/c/f=0/0/0 sis=29) [0] r=0 lpr=29 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:01 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"} v 0) v1
Dec 01 09:15:01 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/1765389376' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Dec 01 09:15:02 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e30 do_prune osdmap full prune enabled
Dec 01 09:15:02 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/1765389376' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec 01 09:15:02 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e31 e31: 3 total, 3 up, 3 in
Dec 01 09:15:02 compute-0 elated_sanderson[95799]: pool 'cephfs.cephfs.data' created
Dec 01 09:15:02 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e31: 3 total, 3 up, 3 in
Dec 01 09:15:02 compute-0 ceph-mon[75031]: pgmap v74: 6 pgs: 1 unknown, 5 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:02 compute-0 ceph-mon[75031]: osdmap e30: 3 total, 3 up, 3 in
Dec 01 09:15:02 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1765389376' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Dec 01 09:15:02 compute-0 systemd[1]: libpod-e320005a9af5df5858ea1f45fe752eb0eaf842a55592d4f3848ca476c4b8b740.scope: Deactivated successfully.
Dec 01 09:15:02 compute-0 podman[95783]: 2025-12-01 09:15:02.33555693 +0000 UTC m=+1.589739380 container died e320005a9af5df5858ea1f45fe752eb0eaf842a55592d4f3848ca476c4b8b740 (image=quay.io/ceph/ceph:v18, name=elated_sanderson, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:15:02 compute-0 systemd[1]: var-lib-containers-storage-overlay-b0a8ec137484ab4f3843ac06a7f950734647c9e87c6b3a3b1477e6e8ca744698-merged.mount: Deactivated successfully.
Dec 01 09:15:02 compute-0 podman[95783]: 2025-12-01 09:15:02.376456056 +0000 UTC m=+1.630638486 container remove e320005a9af5df5858ea1f45fe752eb0eaf842a55592d4f3848ca476c4b8b740 (image=quay.io/ceph/ceph:v18, name=elated_sanderson, CEPH_REF=reef, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:15:02 compute-0 systemd[1]: libpod-conmon-e320005a9af5df5858ea1f45fe752eb0eaf842a55592d4f3848ca476c4b8b740.scope: Deactivated successfully.
Dec 01 09:15:02 compute-0 sudo[95780]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:02 compute-0 sudo[95862]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jncgvudthwfnifsxquhydwaulweyrywx ; /usr/bin/python3'
Dec 01 09:15:02 compute-0 sudo[95862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:15:02 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 31 pg[7.0( empty local-lis/les=0/0 n=0 ec=31/31 lis/c=0/0 les/c/f=0/0/0 sis=31) [1] r=0 lpr=31 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:02 compute-0 python3[95864]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool application enable vms rbd _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:15:02 compute-0 podman[95865]: 2025-12-01 09:15:02.772428786 +0000 UTC m=+0.044841257 container create de2bbc03eaf1020d679bf06e5ba11ef29ed59189ada8d044bfce15b60f6154a7 (image=quay.io/ceph/ceph:v18, name=festive_mestorf, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True)
Dec 01 09:15:02 compute-0 systemd[1]: Started libpod-conmon-de2bbc03eaf1020d679bf06e5ba11ef29ed59189ada8d044bfce15b60f6154a7.scope.
Dec 01 09:15:02 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5487e1af34f86b0f3021e2871e9de50af06f5dbff044f7c05cda784cc1ae6630/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5487e1af34f86b0f3021e2871e9de50af06f5dbff044f7c05cda784cc1ae6630/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:02 compute-0 podman[95865]: 2025-12-01 09:15:02.752590082 +0000 UTC m=+0.025002573 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:15:02 compute-0 podman[95865]: 2025-12-01 09:15:02.853856239 +0000 UTC m=+0.126268740 container init de2bbc03eaf1020d679bf06e5ba11ef29ed59189ada8d044bfce15b60f6154a7 (image=quay.io/ceph/ceph:v18, name=festive_mestorf, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True)
Dec 01 09:15:02 compute-0 podman[95865]: 2025-12-01 09:15:02.860885843 +0000 UTC m=+0.133298334 container start de2bbc03eaf1020d679bf06e5ba11ef29ed59189ada8d044bfce15b60f6154a7 (image=quay.io/ceph/ceph:v18, name=festive_mestorf, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:15:02 compute-0 podman[95865]: 2025-12-01 09:15:02.864909525 +0000 UTC m=+0.137322057 container attach de2bbc03eaf1020d679bf06e5ba11ef29ed59189ada8d044bfce15b60f6154a7 (image=quay.io/ceph/ceph:v18, name=festive_mestorf, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3)
Dec 01 09:15:02 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e31 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:15:03 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v77: 7 pgs: 2 unknown, 5 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:03 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e31 do_prune osdmap full prune enabled
Dec 01 09:15:03 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e32 e32: 3 total, 3 up, 3 in
Dec 01 09:15:03 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1765389376' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec 01 09:15:03 compute-0 ceph-mon[75031]: osdmap e31: 3 total, 3 up, 3 in
Dec 01 09:15:03 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e32: 3 total, 3 up, 3 in
Dec 01 09:15:03 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 32 pg[7.0( empty local-lis/les=31/32 n=0 ec=31/31 lis/c=0/0 les/c/f=0/0/0 sis=31) [1] r=0 lpr=31 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:03 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"} v 0) v1
Dec 01 09:15:03 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/1407074296' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]: dispatch
Dec 01 09:15:04 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e32 do_prune osdmap full prune enabled
Dec 01 09:15:04 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/1407074296' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]': finished
Dec 01 09:15:04 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e33 e33: 3 total, 3 up, 3 in
Dec 01 09:15:04 compute-0 festive_mestorf[95880]: enabled application 'rbd' on pool 'vms'
Dec 01 09:15:04 compute-0 systemd[1]: libpod-de2bbc03eaf1020d679bf06e5ba11ef29ed59189ada8d044bfce15b60f6154a7.scope: Deactivated successfully.
Dec 01 09:15:04 compute-0 podman[95865]: 2025-12-01 09:15:04.558888922 +0000 UTC m=+1.831301403 container died de2bbc03eaf1020d679bf06e5ba11ef29ed59189ada8d044bfce15b60f6154a7 (image=quay.io/ceph/ceph:v18, name=festive_mestorf, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:15:04 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e33: 3 total, 3 up, 3 in
Dec 01 09:15:04 compute-0 ceph-mon[75031]: pgmap v77: 7 pgs: 2 unknown, 5 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:04 compute-0 ceph-mon[75031]: osdmap e32: 3 total, 3 up, 3 in
Dec 01 09:15:04 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1407074296' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]: dispatch
Dec 01 09:15:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-5487e1af34f86b0f3021e2871e9de50af06f5dbff044f7c05cda784cc1ae6630-merged.mount: Deactivated successfully.
Dec 01 09:15:04 compute-0 podman[95865]: 2025-12-01 09:15:04.956693005 +0000 UTC m=+2.229105486 container remove de2bbc03eaf1020d679bf06e5ba11ef29ed59189ada8d044bfce15b60f6154a7 (image=quay.io/ceph/ceph:v18, name=festive_mestorf, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, ceph=True)
Dec 01 09:15:04 compute-0 sudo[95862]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:05 compute-0 systemd[1]: libpod-conmon-de2bbc03eaf1020d679bf06e5ba11ef29ed59189ada8d044bfce15b60f6154a7.scope: Deactivated successfully.
Dec 01 09:15:05 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v80: 7 pgs: 2 unknown, 5 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:05 compute-0 sudo[95941]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtzxxbrplzujhbtweoxawvfnaicjrurc ; /usr/bin/python3'
Dec 01 09:15:05 compute-0 sudo[95941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:15:05 compute-0 python3[95943]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool application enable volumes rbd _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:15:05 compute-0 podman[95944]: 2025-12-01 09:15:05.298333786 +0000 UTC m=+0.041530089 container create eb314f53a0fff35d5fe77a34e9be1a8b241b4cfdb21f91372da4d593bf6d9289 (image=quay.io/ceph/ceph:v18, name=keen_lichterman, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:15:05 compute-0 systemd[1]: Started libpod-conmon-eb314f53a0fff35d5fe77a34e9be1a8b241b4cfdb21f91372da4d593bf6d9289.scope.
Dec 01 09:15:05 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b06f4489a330341eb79e1af60872a39166144a92190e09b524d2833a5a074c23/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b06f4489a330341eb79e1af60872a39166144a92190e09b524d2833a5a074c23/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:05 compute-0 podman[95944]: 2025-12-01 09:15:05.28143105 +0000 UTC m=+0.024627363 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:15:05 compute-0 podman[95944]: 2025-12-01 09:15:05.391987348 +0000 UTC m=+0.135183671 container init eb314f53a0fff35d5fe77a34e9be1a8b241b4cfdb21f91372da4d593bf6d9289 (image=quay.io/ceph/ceph:v18, name=keen_lichterman, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True)
Dec 01 09:15:05 compute-0 podman[95944]: 2025-12-01 09:15:05.397208534 +0000 UTC m=+0.140404837 container start eb314f53a0fff35d5fe77a34e9be1a8b241b4cfdb21f91372da4d593bf6d9289 (image=quay.io/ceph/ceph:v18, name=keen_lichterman, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Dec 01 09:15:05 compute-0 podman[95944]: 2025-12-01 09:15:05.400514899 +0000 UTC m=+0.143711222 container attach eb314f53a0fff35d5fe77a34e9be1a8b241b4cfdb21f91372da4d593bf6d9289 (image=quay.io/ceph/ceph:v18, name=keen_lichterman, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.build-date=20250507)
Dec 01 09:15:05 compute-0 ceph-mon[75031]: log_channel(cluster) log [WRN] : Health check update: 5 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Dec 01 09:15:05 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1407074296' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]': finished
Dec 01 09:15:05 compute-0 ceph-mon[75031]: osdmap e33: 3 total, 3 up, 3 in
Dec 01 09:15:06 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"} v 0) v1
Dec 01 09:15:06 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/4284969463' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]: dispatch
Dec 01 09:15:06 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e33 do_prune osdmap full prune enabled
Dec 01 09:15:07 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v81: 7 pgs: 1 unknown, 6 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:07 compute-0 ceph-mon[75031]: pgmap v80: 7 pgs: 2 unknown, 5 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:07 compute-0 ceph-mon[75031]: Health check update: 5 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Dec 01 09:15:07 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/4284969463' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]: dispatch
Dec 01 09:15:07 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/4284969463' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]': finished
Dec 01 09:15:07 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e34 e34: 3 total, 3 up, 3 in
Dec 01 09:15:07 compute-0 keen_lichterman[95960]: enabled application 'rbd' on pool 'volumes'
Dec 01 09:15:07 compute-0 systemd[1]: libpod-eb314f53a0fff35d5fe77a34e9be1a8b241b4cfdb21f91372da4d593bf6d9289.scope: Deactivated successfully.
Dec 01 09:15:07 compute-0 conmon[95960]: conmon eb314f53a0fff35d5fe7 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-eb314f53a0fff35d5fe77a34e9be1a8b241b4cfdb21f91372da4d593bf6d9289.scope/container/memory.events
Dec 01 09:15:07 compute-0 podman[95944]: 2025-12-01 09:15:07.214478289 +0000 UTC m=+1.957674592 container died eb314f53a0fff35d5fe77a34e9be1a8b241b4cfdb21f91372da4d593bf6d9289 (image=quay.io/ceph/ceph:v18, name=keen_lichterman, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507)
Dec 01 09:15:07 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e34: 3 total, 3 up, 3 in
Dec 01 09:15:07 compute-0 systemd[1]: var-lib-containers-storage-overlay-b06f4489a330341eb79e1af60872a39166144a92190e09b524d2833a5a074c23-merged.mount: Deactivated successfully.
Dec 01 09:15:07 compute-0 podman[95944]: 2025-12-01 09:15:07.481497091 +0000 UTC m=+2.224693394 container remove eb314f53a0fff35d5fe77a34e9be1a8b241b4cfdb21f91372da4d593bf6d9289 (image=quay.io/ceph/ceph:v18, name=keen_lichterman, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 01 09:15:07 compute-0 sudo[95941]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:07 compute-0 systemd[1]: libpod-conmon-eb314f53a0fff35d5fe77a34e9be1a8b241b4cfdb21f91372da4d593bf6d9289.scope: Deactivated successfully.
Dec 01 09:15:07 compute-0 sudo[96020]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idjrbuvowclkmzlgffcmaxesfjooemag ; /usr/bin/python3'
Dec 01 09:15:07 compute-0 sudo[96020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:15:07 compute-0 python3[96022]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool application enable backups rbd _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:15:07 compute-0 podman[96023]: 2025-12-01 09:15:07.851918795 +0000 UTC m=+0.048311344 container create 31c26afce4c629513a030b74ef86e372493b81dbed02cfdc2c71799b4b1ae7d9 (image=quay.io/ceph/ceph:v18, name=heuristic_lehmann, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, OSD_FLAVOR=default)
Dec 01 09:15:07 compute-0 systemd[1]: Started libpod-conmon-31c26afce4c629513a030b74ef86e372493b81dbed02cfdc2c71799b4b1ae7d9.scope.
Dec 01 09:15:07 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8528b24c36d5b81d162223350a8f63a8b0ec6e2d360df4ce267a3c19c93a49d6/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8528b24c36d5b81d162223350a8f63a8b0ec6e2d360df4ce267a3c19c93a49d6/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:07 compute-0 podman[96023]: 2025-12-01 09:15:07.924576331 +0000 UTC m=+0.120968870 container init 31c26afce4c629513a030b74ef86e372493b81dbed02cfdc2c71799b4b1ae7d9 (image=quay.io/ceph/ceph:v18, name=heuristic_lehmann, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0)
Dec 01 09:15:07 compute-0 podman[96023]: 2025-12-01 09:15:07.831449205 +0000 UTC m=+0.027841744 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:15:07 compute-0 podman[96023]: 2025-12-01 09:15:07.931668276 +0000 UTC m=+0.128060795 container start 31c26afce4c629513a030b74ef86e372493b81dbed02cfdc2c71799b4b1ae7d9 (image=quay.io/ceph/ceph:v18, name=heuristic_lehmann, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Dec 01 09:15:07 compute-0 podman[96023]: 2025-12-01 09:15:07.942241131 +0000 UTC m=+0.138633650 container attach 31c26afce4c629513a030b74ef86e372493b81dbed02cfdc2c71799b4b1ae7d9 (image=quay.io/ceph/ceph:v18, name=heuristic_lehmann, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 01 09:15:07 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e34 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:15:08 compute-0 ceph-mon[75031]: pgmap v81: 7 pgs: 1 unknown, 6 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:08 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/4284969463' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]': finished
Dec 01 09:15:08 compute-0 ceph-mon[75031]: osdmap e34: 3 total, 3 up, 3 in
Dec 01 09:15:08 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"} v 0) v1
Dec 01 09:15:08 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/2566742220' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]: dispatch
Dec 01 09:15:09 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v83: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:09 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e34 do_prune osdmap full prune enabled
Dec 01 09:15:09 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2566742220' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]: dispatch
Dec 01 09:15:09 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/2566742220' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]': finished
Dec 01 09:15:09 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e35 e35: 3 total, 3 up, 3 in
Dec 01 09:15:09 compute-0 heuristic_lehmann[96038]: enabled application 'rbd' on pool 'backups'
Dec 01 09:15:09 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e35: 3 total, 3 up, 3 in
Dec 01 09:15:09 compute-0 systemd[1]: libpod-31c26afce4c629513a030b74ef86e372493b81dbed02cfdc2c71799b4b1ae7d9.scope: Deactivated successfully.
Dec 01 09:15:09 compute-0 podman[96063]: 2025-12-01 09:15:09.262542566 +0000 UTC m=+0.023409594 container died 31c26afce4c629513a030b74ef86e372493b81dbed02cfdc2c71799b4b1ae7d9 (image=quay.io/ceph/ceph:v18, name=heuristic_lehmann, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS)
Dec 01 09:15:09 compute-0 systemd[1]: var-lib-containers-storage-overlay-8528b24c36d5b81d162223350a8f63a8b0ec6e2d360df4ce267a3c19c93a49d6-merged.mount: Deactivated successfully.
Dec 01 09:15:09 compute-0 podman[96063]: 2025-12-01 09:15:09.302352699 +0000 UTC m=+0.063219717 container remove 31c26afce4c629513a030b74ef86e372493b81dbed02cfdc2c71799b4b1ae7d9 (image=quay.io/ceph/ceph:v18, name=heuristic_lehmann, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:15:09 compute-0 systemd[1]: libpod-conmon-31c26afce4c629513a030b74ef86e372493b81dbed02cfdc2c71799b4b1ae7d9.scope: Deactivated successfully.
Dec 01 09:15:09 compute-0 sudo[96020]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:09 compute-0 sudo[96101]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dovfjulyafwzzsmfehxevsdyscofarrm ; /usr/bin/python3'
Dec 01 09:15:09 compute-0 sudo[96101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:15:09 compute-0 python3[96103]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool application enable images rbd _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:15:09 compute-0 podman[96104]: 2025-12-01 09:15:09.656078024 +0000 UTC m=+0.045444573 container create b6ba5c3d9be627014bef53b387a8b81c403e9655d82bbe7bf9c197fd703de52d (image=quay.io/ceph/ceph:v18, name=eager_engelbart, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Dec 01 09:15:09 compute-0 systemd[1]: Started libpod-conmon-b6ba5c3d9be627014bef53b387a8b81c403e9655d82bbe7bf9c197fd703de52d.scope.
Dec 01 09:15:09 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8e2a33a845c37db36918105ca154bf9039932d469b00fbbf11c524feafcfd0a6/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8e2a33a845c37db36918105ca154bf9039932d469b00fbbf11c524feafcfd0a6/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:09 compute-0 podman[96104]: 2025-12-01 09:15:09.632255678 +0000 UTC m=+0.021622207 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:15:09 compute-0 podman[96104]: 2025-12-01 09:15:09.73662464 +0000 UTC m=+0.125991169 container init b6ba5c3d9be627014bef53b387a8b81c403e9655d82bbe7bf9c197fd703de52d (image=quay.io/ceph/ceph:v18, name=eager_engelbart, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:15:09 compute-0 podman[96104]: 2025-12-01 09:15:09.741658649 +0000 UTC m=+0.131025148 container start b6ba5c3d9be627014bef53b387a8b81c403e9655d82bbe7bf9c197fd703de52d (image=quay.io/ceph/ceph:v18, name=eager_engelbart, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec 01 09:15:09 compute-0 podman[96104]: 2025-12-01 09:15:09.745380637 +0000 UTC m=+0.134747146 container attach b6ba5c3d9be627014bef53b387a8b81c403e9655d82bbe7bf9c197fd703de52d (image=quay.io/ceph/ceph:v18, name=eager_engelbart, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:15:10 compute-0 ceph-mon[75031]: pgmap v83: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:10 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2566742220' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]': finished
Dec 01 09:15:10 compute-0 ceph-mon[75031]: osdmap e35: 3 total, 3 up, 3 in
Dec 01 09:15:10 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool application enable", "pool": "images", "app": "rbd"} v 0) v1
Dec 01 09:15:10 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3179720452' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]: dispatch
Dec 01 09:15:11 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v85: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:11 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e35 do_prune osdmap full prune enabled
Dec 01 09:15:11 compute-0 ceph-mon[75031]: log_channel(cluster) log [WRN] : Health check update: 3 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Dec 01 09:15:11 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3179720452' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]: dispatch
Dec 01 09:15:11 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3179720452' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Dec 01 09:15:11 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e36 e36: 3 total, 3 up, 3 in
Dec 01 09:15:11 compute-0 eager_engelbart[96119]: enabled application 'rbd' on pool 'images'
Dec 01 09:15:11 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e36: 3 total, 3 up, 3 in
Dec 01 09:15:11 compute-0 systemd[1]: libpod-b6ba5c3d9be627014bef53b387a8b81c403e9655d82bbe7bf9c197fd703de52d.scope: Deactivated successfully.
Dec 01 09:15:11 compute-0 conmon[96119]: conmon b6ba5c3d9be627014bef <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-b6ba5c3d9be627014bef53b387a8b81c403e9655d82bbe7bf9c197fd703de52d.scope/container/memory.events
Dec 01 09:15:11 compute-0 podman[96104]: 2025-12-01 09:15:11.256255839 +0000 UTC m=+1.645622348 container died b6ba5c3d9be627014bef53b387a8b81c403e9655d82bbe7bf9c197fd703de52d (image=quay.io/ceph/ceph:v18, name=eager_engelbart, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507)
Dec 01 09:15:11 compute-0 systemd[1]: var-lib-containers-storage-overlay-8e2a33a845c37db36918105ca154bf9039932d469b00fbbf11c524feafcfd0a6-merged.mount: Deactivated successfully.
Dec 01 09:15:11 compute-0 podman[96104]: 2025-12-01 09:15:11.302917389 +0000 UTC m=+1.692283888 container remove b6ba5c3d9be627014bef53b387a8b81c403e9655d82bbe7bf9c197fd703de52d (image=quay.io/ceph/ceph:v18, name=eager_engelbart, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:15:11 compute-0 systemd[1]: libpod-conmon-b6ba5c3d9be627014bef53b387a8b81c403e9655d82bbe7bf9c197fd703de52d.scope: Deactivated successfully.
Dec 01 09:15:11 compute-0 sudo[96101]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:11 compute-0 sudo[96181]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twilvhudxhvztbpabnivpedfqnbhytnm ; /usr/bin/python3'
Dec 01 09:15:11 compute-0 sudo[96181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:15:11 compute-0 python3[96183]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool application enable cephfs.cephfs.meta cephfs _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:15:11 compute-0 podman[96184]: 2025-12-01 09:15:11.619812325 +0000 UTC m=+0.041562970 container create d922f5f30706ffbfe3314ef25fc7147d2e6304b2c9c756068ccfaf86e34ec7d1 (image=quay.io/ceph/ceph:v18, name=clever_snyder, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0)
Dec 01 09:15:11 compute-0 systemd[1]: Started libpod-conmon-d922f5f30706ffbfe3314ef25fc7147d2e6304b2c9c756068ccfaf86e34ec7d1.scope.
Dec 01 09:15:11 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7f3934b2808e71c83464f81ce06020197d8739b5ad35fa4ac7282c8a8421f7a3/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7f3934b2808e71c83464f81ce06020197d8739b5ad35fa4ac7282c8a8421f7a3/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:11 compute-0 podman[96184]: 2025-12-01 09:15:11.694685881 +0000 UTC m=+0.116436526 container init d922f5f30706ffbfe3314ef25fc7147d2e6304b2c9c756068ccfaf86e34ec7d1 (image=quay.io/ceph/ceph:v18, name=clever_snyder, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:15:11 compute-0 podman[96184]: 2025-12-01 09:15:11.602716603 +0000 UTC m=+0.024467258 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:15:11 compute-0 podman[96184]: 2025-12-01 09:15:11.70125387 +0000 UTC m=+0.123004515 container start d922f5f30706ffbfe3314ef25fc7147d2e6304b2c9c756068ccfaf86e34ec7d1 (image=quay.io/ceph/ceph:v18, name=clever_snyder, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_REF=reef)
Dec 01 09:15:11 compute-0 podman[96184]: 2025-12-01 09:15:11.704446941 +0000 UTC m=+0.126197586 container attach d922f5f30706ffbfe3314ef25fc7147d2e6304b2c9c756068ccfaf86e34ec7d1 (image=quay.io/ceph/ceph:v18, name=clever_snyder, io.buildah.version=1.39.3, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507)
Dec 01 09:15:12 compute-0 ceph-mon[75031]: pgmap v85: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:12 compute-0 ceph-mon[75031]: Health check update: 3 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Dec 01 09:15:12 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3179720452' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Dec 01 09:15:12 compute-0 ceph-mon[75031]: osdmap e36: 3 total, 3 up, 3 in
Dec 01 09:15:12 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"} v 0) v1
Dec 01 09:15:12 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/216207811' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]: dispatch
Dec 01 09:15:12 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e36 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:15:12 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:15:12
Dec 01 09:15:12 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:15:12 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:15:12 compute-0 ceph-mgr[75324]: [balancer INFO root] pools ['volumes', 'cephfs.cephfs.data', '.mgr', 'vms', 'backups', 'images', 'cephfs.cephfs.meta']
Dec 01 09:15:12 compute-0 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec 01 09:15:13 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:15:13 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v87: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:13 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:15:13 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:15:13 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:15:13 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 1)
Dec 01 09:15:13 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:15:13 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 1)
Dec 01 09:15:13 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:15:13 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 1)
Dec 01 09:15:13 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:15:13 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 1)
Dec 01 09:15:13 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:15:13 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 1)
Dec 01 09:15:13 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:15:13 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 1)
Dec 01 09:15:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"} v 0) v1
Dec 01 09:15:13 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]: dispatch
Dec 01 09:15:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:15:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:15:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:15:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:15:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:15:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:15:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:15:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:15:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:15:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:15:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:15:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:15:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:15:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:15:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:15:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:15:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e36 do_prune osdmap full prune enabled
Dec 01 09:15:13 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/216207811' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]: dispatch
Dec 01 09:15:13 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]: dispatch
Dec 01 09:15:13 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/216207811' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Dec 01 09:15:13 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]': finished
Dec 01 09:15:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e37 e37: 3 total, 3 up, 3 in
Dec 01 09:15:13 compute-0 clever_snyder[96199]: enabled application 'cephfs' on pool 'cephfs.cephfs.meta'
Dec 01 09:15:13 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e37: 3 total, 3 up, 3 in
Dec 01 09:15:13 compute-0 ceph-mgr[75324]: [progress INFO root] update: starting ev 266da851-2f25-4ebc-a75f-f2e1e54bce3c (PG autoscaler increasing pool 2 PGs from 1 to 32)
Dec 01 09:15:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"} v 0) v1
Dec 01 09:15:13 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]: dispatch
Dec 01 09:15:13 compute-0 systemd[1]: libpod-d922f5f30706ffbfe3314ef25fc7147d2e6304b2c9c756068ccfaf86e34ec7d1.scope: Deactivated successfully.
Dec 01 09:15:13 compute-0 podman[96184]: 2025-12-01 09:15:13.263798872 +0000 UTC m=+1.685549517 container died d922f5f30706ffbfe3314ef25fc7147d2e6304b2c9c756068ccfaf86e34ec7d1 (image=quay.io/ceph/ceph:v18, name=clever_snyder, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:15:13 compute-0 systemd[1]: var-lib-containers-storage-overlay-7f3934b2808e71c83464f81ce06020197d8739b5ad35fa4ac7282c8a8421f7a3-merged.mount: Deactivated successfully.
Dec 01 09:15:13 compute-0 podman[96184]: 2025-12-01 09:15:13.313389616 +0000 UTC m=+1.735140261 container remove d922f5f30706ffbfe3314ef25fc7147d2e6304b2c9c756068ccfaf86e34ec7d1 (image=quay.io/ceph/ceph:v18, name=clever_snyder, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:15:13 compute-0 systemd[1]: libpod-conmon-d922f5f30706ffbfe3314ef25fc7147d2e6304b2c9c756068ccfaf86e34ec7d1.scope: Deactivated successfully.
Dec 01 09:15:13 compute-0 sudo[96181]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:13 compute-0 sudo[96260]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adptmvenqzhsenehvzdkqptvujtrxbtr ; /usr/bin/python3'
Dec 01 09:15:13 compute-0 sudo[96260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:15:13 compute-0 python3[96262]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool application enable cephfs.cephfs.data cephfs _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:15:13 compute-0 podman[96263]: 2025-12-01 09:15:13.704241799 +0000 UTC m=+0.063753014 container create 7a5134e33a373cf1a8a266c181e8d0e8dc67ea0b34bc95f32041399cd9340042 (image=quay.io/ceph/ceph:v18, name=serene_easley, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:15:13 compute-0 systemd[1]: Started libpod-conmon-7a5134e33a373cf1a8a266c181e8d0e8dc67ea0b34bc95f32041399cd9340042.scope.
Dec 01 09:15:13 compute-0 podman[96263]: 2025-12-01 09:15:13.669776075 +0000 UTC m=+0.029287380 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:15:13 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35b9ad00e67a5cce58f6dd547bc22c9c514b4722e618803ab2663e39a68b02e0/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35b9ad00e67a5cce58f6dd547bc22c9c514b4722e618803ab2663e39a68b02e0/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:13 compute-0 podman[96263]: 2025-12-01 09:15:13.781097707 +0000 UTC m=+0.140609002 container init 7a5134e33a373cf1a8a266c181e8d0e8dc67ea0b34bc95f32041399cd9340042 (image=quay.io/ceph/ceph:v18, name=serene_easley, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:15:13 compute-0 podman[96263]: 2025-12-01 09:15:13.790880868 +0000 UTC m=+0.150392113 container start 7a5134e33a373cf1a8a266c181e8d0e8dc67ea0b34bc95f32041399cd9340042 (image=quay.io/ceph/ceph:v18, name=serene_easley, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:15:13 compute-0 podman[96263]: 2025-12-01 09:15:13.797162057 +0000 UTC m=+0.156673352 container attach 7a5134e33a373cf1a8a266c181e8d0e8dc67ea0b34bc95f32041399cd9340042 (image=quay.io/ceph/ceph:v18, name=serene_easley, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec 01 09:15:14 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e37 do_prune osdmap full prune enabled
Dec 01 09:15:14 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]': finished
Dec 01 09:15:14 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e38 e38: 3 total, 3 up, 3 in
Dec 01 09:15:14 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e38: 3 total, 3 up, 3 in
Dec 01 09:15:14 compute-0 ceph-mgr[75324]: [progress INFO root] update: starting ev 52c11968-3d0c-400a-9e65-b1c6a27534f8 (PG autoscaler increasing pool 3 PGs from 1 to 32)
Dec 01 09:15:14 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"} v 0) v1
Dec 01 09:15:14 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]: dispatch
Dec 01 09:15:14 compute-0 ceph-mon[75031]: pgmap v87: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:14 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/216207811' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Dec 01 09:15:14 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]': finished
Dec 01 09:15:14 compute-0 ceph-mon[75031]: osdmap e37: 3 total, 3 up, 3 in
Dec 01 09:15:14 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]: dispatch
Dec 01 09:15:14 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"} v 0) v1
Dec 01 09:15:14 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3525570064' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]: dispatch
Dec 01 09:15:15 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v90: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:15 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"} v 0) v1
Dec 01 09:15:15 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 01 09:15:15 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"} v 0) v1
Dec 01 09:15:15 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 01 09:15:15 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e38 do_prune osdmap full prune enabled
Dec 01 09:15:15 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]': finished
Dec 01 09:15:15 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3525570064' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Dec 01 09:15:15 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Dec 01 09:15:15 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Dec 01 09:15:15 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e39 e39: 3 total, 3 up, 3 in
Dec 01 09:15:15 compute-0 serene_easley[96278]: enabled application 'cephfs' on pool 'cephfs.cephfs.data'
Dec 01 09:15:15 compute-0 systemd[1]: libpod-7a5134e33a373cf1a8a266c181e8d0e8dc67ea0b34bc95f32041399cd9340042.scope: Deactivated successfully.
Dec 01 09:15:15 compute-0 podman[96263]: 2025-12-01 09:15:15.346238721 +0000 UTC m=+1.705749936 container died 7a5134e33a373cf1a8a266c181e8d0e8dc67ea0b34bc95f32041399cd9340042 (image=quay.io/ceph/ceph:v18, name=serene_easley, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:15:15 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e39: 3 total, 3 up, 3 in
Dec 01 09:15:15 compute-0 ceph-mgr[75324]: [progress INFO root] update: starting ev 01d93ced-24a8-46f9-aecc-b9550ca5544f (PG autoscaler increasing pool 4 PGs from 1 to 32)
Dec 01 09:15:15 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"} v 0) v1
Dec 01 09:15:15 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]: dispatch
Dec 01 09:15:15 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 39 pg[3.0( empty local-lis/les=23/24 n=0 ec=23/23 lis/c=23/23 les/c/f=24/24/0 sis=39 pruub=11.517323494s) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active pruub 71.754142761s@ mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [1], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:15 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 39 pg[3.0( empty local-lis/les=23/24 n=0 ec=23/23 lis/c=23/23 les/c/f=24/24/0 sis=39 pruub=11.517323494s) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown pruub 71.754142761s@ mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:15 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]': finished
Dec 01 09:15:15 compute-0 ceph-mon[75031]: osdmap e38: 3 total, 3 up, 3 in
Dec 01 09:15:15 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]: dispatch
Dec 01 09:15:15 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3525570064' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]: dispatch
Dec 01 09:15:15 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 01 09:15:15 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 01 09:15:15 compute-0 systemd[1]: var-lib-containers-storage-overlay-35b9ad00e67a5cce58f6dd547bc22c9c514b4722e618803ab2663e39a68b02e0-merged.mount: Deactivated successfully.
Dec 01 09:15:15 compute-0 podman[96263]: 2025-12-01 09:15:15.979755825 +0000 UTC m=+2.339267030 container remove 7a5134e33a373cf1a8a266c181e8d0e8dc67ea0b34bc95f32041399cd9340042 (image=quay.io/ceph/ceph:v18, name=serene_easley, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507)
Dec 01 09:15:16 compute-0 sudo[96260]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:16 compute-0 systemd[1]: libpod-conmon-7a5134e33a373cf1a8a266c181e8d0e8dc67ea0b34bc95f32041399cd9340042.scope: Deactivated successfully.
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 39 pg[2.0( empty local-lis/les=21/22 n=0 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=39 pruub=8.982954979s) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active pruub 61.020610809s@ mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [2], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 39 pg[2.0( empty local-lis/les=21/22 n=0 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=39 pruub=8.982954979s) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown pruub 61.020610809s@ mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e39 do_prune osdmap full prune enabled
Dec 01 09:15:16 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Dec 01 09:15:16 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e40 e40: 3 total, 3 up, 3 in
Dec 01 09:15:16 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e40: 3 total, 3 up, 3 in
Dec 01 09:15:16 compute-0 ceph-mgr[75324]: [progress INFO root] update: starting ev 5c9d7e9c-1f04-4651-aec0-3bbe4e7daa02 (PG autoscaler increasing pool 5 PGs from 1 to 32)
Dec 01 09:15:16 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"} v 0) v1
Dec 01 09:15:16 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]: dispatch
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.1c( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.1d( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.a( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.1f( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.1e( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.1b( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.1f( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.1c( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.b( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.1d( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.9( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.8( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.1e( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.a( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.9( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.8( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.5( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.7( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.6( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.5( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.1( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.4( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.3( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.6( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.7( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.3( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.c( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.d( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.1( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.e( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.2( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.f( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.10( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.11( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.12( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.13( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.15( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.14( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.16( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.17( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.18( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.19( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.4( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.2( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.b( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.c( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.d( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.e( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.f( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.10( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.11( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.12( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.13( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.14( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.15( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.16( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.18( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.17( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.19( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.1a( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.1a( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.1c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.1b( empty local-lis/les=21/22 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.1e( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.0( empty local-lis/les=39/40 n=0 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.1( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.e( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.12( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.10( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.14( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 40 pg[2.1a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=21/21 les/c/f=22/22/0 sis=39) [2] r=0 lpr=39 pi=[21,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.4( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.0( empty local-lis/les=39/40 n=0 ec=23/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.2( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.10( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.13( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.14( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.19( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.1a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 40 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:16 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.1 scrub starts
Dec 01 09:15:16 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.1 scrub ok
Dec 01 09:15:16 compute-0 ceph-mon[75031]: pgmap v90: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:16 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]': finished
Dec 01 09:15:16 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3525570064' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Dec 01 09:15:16 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Dec 01 09:15:16 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Dec 01 09:15:16 compute-0 ceph-mon[75031]: osdmap e39: 3 total, 3 up, 3 in
Dec 01 09:15:16 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]: dispatch
Dec 01 09:15:16 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Dec 01 09:15:16 compute-0 ceph-mon[75031]: osdmap e40: 3 total, 3 up, 3 in
Dec 01 09:15:16 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]: dispatch
Dec 01 09:15:17 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v93: 69 pgs: 62 unknown, 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:17 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"} v 0) v1
Dec 01 09:15:17 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 01 09:15:17 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"} v 0) v1
Dec 01 09:15:17 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 01 09:15:17 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e40 do_prune osdmap full prune enabled
Dec 01 09:15:17 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]': finished
Dec 01 09:15:17 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Dec 01 09:15:17 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]': finished
Dec 01 09:15:17 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e41 e41: 3 total, 3 up, 3 in
Dec 01 09:15:17 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e41: 3 total, 3 up, 3 in
Dec 01 09:15:17 compute-0 ceph-mgr[75324]: [progress INFO root] update: starting ev 4e48849a-3329-4b0d-93ed-40d2131a9cce (PG autoscaler increasing pool 6 PGs from 1 to 32)
Dec 01 09:15:17 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"} v 0) v1
Dec 01 09:15:17 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Dec 01 09:15:17 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 41 pg[5.0( empty local-lis/les=27/28 n=0 ec=27/27 lis/c=27/27 les/c/f=28/28/0 sis=41 pruub=13.927580833s) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active pruub 67.123153687s@ mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [2], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:17 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 41 pg[5.0( empty local-lis/les=27/28 n=0 ec=27/27 lis/c=27/27 les/c/f=28/28/0 sis=41 pruub=13.927580833s) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown pruub 67.123153687s@ mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:17 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.2 scrub starts
Dec 01 09:15:17 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.2 scrub ok
Dec 01 09:15:17 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.1 scrub starts
Dec 01 09:15:17 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.1 scrub ok
Dec 01 09:15:17 compute-0 python3[96392]: ansible-ansible.legacy.stat Invoked with path=/tmp/ceph_mds.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 09:15:17 compute-0 ceph-mon[75031]: log_channel(cluster) log [INF] : Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Dec 01 09:15:17 compute-0 ceph-mon[75031]: log_channel(cluster) log [INF] : Cluster is now healthy
Dec 01 09:15:17 compute-0 ceph-mon[75031]: 2.1 scrub starts
Dec 01 09:15:17 compute-0 ceph-mon[75031]: 2.1 scrub ok
Dec 01 09:15:17 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 01 09:15:17 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 01 09:15:17 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]': finished
Dec 01 09:15:17 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Dec 01 09:15:17 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]': finished
Dec 01 09:15:17 compute-0 ceph-mon[75031]: osdmap e41: 3 total, 3 up, 3 in
Dec 01 09:15:17 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Dec 01 09:15:17 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 41 pg[4.0( empty local-lis/les=25/26 n=0 ec=25/25 lis/c=25/25 les/c/f=26/26/0 sis=41 pruub=11.467036247s) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active pruub 78.851821899s@ mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:17 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 41 pg[4.0( empty local-lis/les=25/26 n=0 ec=25/25 lis/c=25/25 les/c/f=26/26/0 sis=41 pruub=11.467036247s) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown pruub 78.851821899s@ mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:17 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e41 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:15:18 compute-0 python3[96463]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764580517.4108396-36519-113963609720530/source dest=/tmp/ceph_mds.yml mode=0644 force=True follow=False _original_basename=ceph_mds.yml.j2 checksum=e359e26d9e42bc107a0de03375144cf8590b6f68 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:15:18 compute-0 ceph-mgr[75324]: [progress WARNING root] Starting Global Recovery Event,93 pgs not in active + clean state
Dec 01 09:15:18 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e41 do_prune osdmap full prune enabled
Dec 01 09:15:18 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]': finished
Dec 01 09:15:18 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e42 e42: 3 total, 3 up, 3 in
Dec 01 09:15:18 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e42: 3 total, 3 up, 3 in
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.1e( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-mgr[75324]: [progress INFO root] update: starting ev 72549a5c-bd64-44e7-a176-0b67e4907218 (PG autoscaler increasing pool 7 PGs from 1 to 32)
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.1f( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.1d( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.7( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.8( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.1c( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.b( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.6( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.1b( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.a( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.1a( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.5( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.9( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.4( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.19( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.3( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.1( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.2( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.d( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.e( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.f( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-mgr[75324]: [progress INFO root] complete: finished ev 266da851-2f25-4ebc-a75f-f2e1e54bce3c (PG autoscaler increasing pool 2 PGs from 1 to 32)
Dec 01 09:15:18 compute-0 ceph-mgr[75324]: [progress INFO root] Completed event 266da851-2f25-4ebc-a75f-f2e1e54bce3c (PG autoscaler increasing pool 2 PGs from 1 to 32) in 5 seconds
Dec 01 09:15:18 compute-0 sudo[96511]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfwdhaffpzwqjypezjlgznyqelkedegn ; /usr/bin/python3'
Dec 01 09:15:18 compute-0 ceph-mgr[75324]: [progress INFO root] complete: finished ev 52c11968-3d0c-400a-9e65-b1c6a27534f8 (PG autoscaler increasing pool 3 PGs from 1 to 32)
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.c( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-mgr[75324]: [progress INFO root] Completed event 52c11968-3d0c-400a-9e65-b1c6a27534f8 (PG autoscaler increasing pool 3 PGs from 1 to 32) in 4 seconds
Dec 01 09:15:18 compute-0 ceph-mgr[75324]: [progress INFO root] complete: finished ev 01d93ced-24a8-46f9-aecc-b9550ca5544f (PG autoscaler increasing pool 4 PGs from 1 to 32)
Dec 01 09:15:18 compute-0 ceph-mgr[75324]: [progress INFO root] Completed event 01d93ced-24a8-46f9-aecc-b9550ca5544f (PG autoscaler increasing pool 4 PGs from 1 to 32) in 3 seconds
Dec 01 09:15:18 compute-0 ceph-mgr[75324]: [progress INFO root] complete: finished ev 5c9d7e9c-1f04-4651-aec0-3bbe4e7daa02 (PG autoscaler increasing pool 5 PGs from 1 to 32)
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.10( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-mgr[75324]: [progress INFO root] Completed event 5c9d7e9c-1f04-4651-aec0-3bbe4e7daa02 (PG autoscaler increasing pool 5 PGs from 1 to 32) in 2 seconds
Dec 01 09:15:18 compute-0 ceph-mgr[75324]: [progress INFO root] complete: finished ev 4e48849a-3329-4b0d-93ed-40d2131a9cce (PG autoscaler increasing pool 6 PGs from 1 to 32)
Dec 01 09:15:18 compute-0 ceph-mgr[75324]: [progress INFO root] Completed event 4e48849a-3329-4b0d-93ed-40d2131a9cce (PG autoscaler increasing pool 6 PGs from 1 to 32) in 1 seconds
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.11( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-mgr[75324]: [progress INFO root] complete: finished ev 72549a5c-bd64-44e7-a176-0b67e4907218 (PG autoscaler increasing pool 7 PGs from 1 to 32)
Dec 01 09:15:18 compute-0 ceph-mgr[75324]: [progress INFO root] Completed event 72549a5c-bd64-44e7-a176-0b67e4907218 (PG autoscaler increasing pool 7 PGs from 1 to 32) in 0 seconds
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.13( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.12( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.15( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.17( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.16( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.18( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.14( empty local-lis/les=25/26 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 sudo[96511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.1e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.1c( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.1d( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.1f( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.10( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.11( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.12( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.13( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.14( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.15( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.16( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.17( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.9( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.8( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.a( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.b( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.7( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.6( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.4( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.3( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.5( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.2( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.1( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.d( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.6( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.1f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.19( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.3( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.0( empty local-lis/les=41/42 n=0 ec=25/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.1d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.13( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.15( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.16( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.17( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.12( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.14( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 42 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=25/25 les/c/f=26/26/0 sis=41) [0] r=0 lpr=41 pi=[25,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.c( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.e( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.f( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.1b( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.1a( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.19( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.1e( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.18( empty local-lis/les=27/28 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.1c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.10( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.1f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.17( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.8( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.0( empty local-lis/les=41/42 n=0 ec=27/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.b( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.6( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.1b( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 42 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=27/27 les/c/f=28/28/0 sis=41) [2] r=0 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:18 compute-0 python3[96513]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z   --volume /tmp/ceph_mds.yml:/home/ceph_spec.yaml:z   --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   fs volume create cephfs '--placement=compute-0 '
                                           _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:15:18 compute-0 podman[96514]: 2025-12-01 09:15:18.577193515 +0000 UTC m=+0.056051009 container create cabf666166aa2bf023f29c31c8271a9fbf643caf8f43f174bd2fb6e54a7d9e34 (image=quay.io/ceph/ceph:v18, name=nice_sinoussi, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, ceph=True)
Dec 01 09:15:18 compute-0 systemd[1]: Started libpod-conmon-cabf666166aa2bf023f29c31c8271a9fbf643caf8f43f174bd2fb6e54a7d9e34.scope.
Dec 01 09:15:18 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78e3c9e348c0ac71434cdf9e5c89072567ddb145ceefc8a34d331a64629e89d2/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78e3c9e348c0ac71434cdf9e5c89072567ddb145ceefc8a34d331a64629e89d2/merged/home/ceph_spec.yaml supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78e3c9e348c0ac71434cdf9e5c89072567ddb145ceefc8a34d331a64629e89d2/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:18 compute-0 podman[96514]: 2025-12-01 09:15:18.553986569 +0000 UTC m=+0.032844153 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:15:18 compute-0 podman[96514]: 2025-12-01 09:15:18.65202323 +0000 UTC m=+0.130880744 container init cabf666166aa2bf023f29c31c8271a9fbf643caf8f43f174bd2fb6e54a7d9e34 (image=quay.io/ceph/ceph:v18, name=nice_sinoussi, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3)
Dec 01 09:15:18 compute-0 podman[96514]: 2025-12-01 09:15:18.658851027 +0000 UTC m=+0.137708521 container start cabf666166aa2bf023f29c31c8271a9fbf643caf8f43f174bd2fb6e54a7d9e34 (image=quay.io/ceph/ceph:v18, name=nice_sinoussi, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:15:18 compute-0 podman[96514]: 2025-12-01 09:15:18.662076209 +0000 UTC m=+0.140933713 container attach cabf666166aa2bf023f29c31c8271a9fbf643caf8f43f174bd2fb6e54a7d9e34 (image=quay.io/ceph/ceph:v18, name=nice_sinoussi, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:15:18 compute-0 ceph-mon[75031]: pgmap v93: 69 pgs: 62 unknown, 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:18 compute-0 ceph-mon[75031]: 2.2 scrub starts
Dec 01 09:15:18 compute-0 ceph-mon[75031]: 2.2 scrub ok
Dec 01 09:15:18 compute-0 ceph-mon[75031]: 3.1 scrub starts
Dec 01 09:15:18 compute-0 ceph-mon[75031]: 3.1 scrub ok
Dec 01 09:15:18 compute-0 ceph-mon[75031]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Dec 01 09:15:18 compute-0 ceph-mon[75031]: Cluster is now healthy
Dec 01 09:15:18 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]': finished
Dec 01 09:15:18 compute-0 ceph-mon[75031]: osdmap e42: 3 total, 3 up, 3 in
Dec 01 09:15:19 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v96: 131 pgs: 93 unknown, 38 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:19 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"} v 0) v1
Dec 01 09:15:19 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 01 09:15:19 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"} v 0) v1
Dec 01 09:15:19 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 01 09:15:19 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14242 -' entity='client.admin' cmd=[{"prefix": "fs volume create", "name": "cephfs", "placement": "compute-0 ", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:15:19 compute-0 ceph-mgr[75324]: [volumes INFO volumes.module] Starting _cmd_fs_volume_create(name:cephfs, placement:compute-0 , prefix:fs volume create, target:['mon-mgr', '']) < ""
Dec 01 09:15:19 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"} v 0) v1
Dec 01 09:15:19 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
Dec 01 09:15:19 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"} v 0) v1
Dec 01 09:15:19 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
Dec 01 09:15:19 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"} v 0) v1
Dec 01 09:15:19 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
Dec 01 09:15:19 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e42 do_prune osdmap full prune enabled
Dec 01 09:15:19 compute-0 ceph-mon[75031]: log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Dec 01 09:15:19 compute-0 ceph-mon[75031]: log_channel(cluster) log [WRN] : Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Dec 01 09:15:19 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0[75027]: 2025-12-01T09:15:19.275+0000 7ff93f5e2640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Dec 01 09:15:19 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Dec 01 09:15:19 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).mds e2 new map
Dec 01 09:15:19 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).mds e2 print_map
                                           e2
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        2
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-12-01T09:15:19.277019+0000
                                           modified        2025-12-01T09:15:19.277097+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           max_mds        1
                                           in        
                                           up        {}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                            
                                            
Dec 01 09:15:19 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Dec 01 09:15:19 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]': finished
Dec 01 09:15:19 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e43 e43: 3 total, 3 up, 3 in
Dec 01 09:15:19 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e43: 3 total, 3 up, 3 in
Dec 01 09:15:19 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : fsmap cephfs:0
Dec 01 09:15:19 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 43 pg[6.0( empty local-lis/les=29/30 n=0 ec=29/29 lis/c=29/29 les/c/f=30/30/0 sis=43 pruub=14.023229599s) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active pruub 82.905815125s@ mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:19 compute-0 ceph-mgr[75324]: [cephadm INFO root] Saving service mds.cephfs spec with placement compute-0
Dec 01 09:15:19 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Saving service mds.cephfs spec with placement compute-0
Dec 01 09:15:19 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mds.cephfs}] v 0) v1
Dec 01 09:15:19 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 43 pg[6.0( empty local-lis/les=29/30 n=0 ec=29/29 lis/c=29/29 les/c/f=30/30/0 sis=43 pruub=14.023229599s) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown pruub 82.905815125s@ mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:19 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:19 compute-0 ceph-mgr[75324]: [volumes INFO volumes.module] Finishing _cmd_fs_volume_create(name:cephfs, placement:compute-0 , prefix:fs volume create, target:['mon-mgr', '']) < ""
Dec 01 09:15:19 compute-0 systemd[1]: libpod-cabf666166aa2bf023f29c31c8271a9fbf643caf8f43f174bd2fb6e54a7d9e34.scope: Deactivated successfully.
Dec 01 09:15:19 compute-0 conmon[96529]: conmon cabf666166aa2bf023f2 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-cabf666166aa2bf023f29c31c8271a9fbf643caf8f43f174bd2fb6e54a7d9e34.scope/container/memory.events
Dec 01 09:15:19 compute-0 podman[96514]: 2025-12-01 09:15:19.327393431 +0000 UTC m=+0.806250925 container died cabf666166aa2bf023f29c31c8271a9fbf643caf8f43f174bd2fb6e54a7d9e34 (image=quay.io/ceph/ceph:v18, name=nice_sinoussi, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:15:19 compute-0 systemd[1]: var-lib-containers-storage-overlay-78e3c9e348c0ac71434cdf9e5c89072567ddb145ceefc8a34d331a64629e89d2-merged.mount: Deactivated successfully.
Dec 01 09:15:19 compute-0 systemd[76658]: Starting Mark boot as successful...
Dec 01 09:15:19 compute-0 systemd[76658]: Finished Mark boot as successful.
Dec 01 09:15:19 compute-0 sudo[96554]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:15:19 compute-0 podman[96514]: 2025-12-01 09:15:19.374451594 +0000 UTC m=+0.853309088 container remove cabf666166aa2bf023f29c31c8271a9fbf643caf8f43f174bd2fb6e54a7d9e34 (image=quay.io/ceph/ceph:v18, name=nice_sinoussi, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=reef, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507)
Dec 01 09:15:19 compute-0 sudo[96554]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:19 compute-0 sudo[96554]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:19 compute-0 systemd[1]: libpod-conmon-cabf666166aa2bf023f29c31c8271a9fbf643caf8f43f174bd2fb6e54a7d9e34.scope: Deactivated successfully.
Dec 01 09:15:19 compute-0 sudo[96511]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:19 compute-0 sudo[96592]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:15:19 compute-0 sudo[96592]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:19 compute-0 sudo[96592]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:19 compute-0 sudo[96617]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:15:19 compute-0 sudo[96617]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:19 compute-0 sudo[96617]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:19 compute-0 sudo[96687]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-huemxzdjfzjjaltzmlvwczzlhiblvmkd ; /usr/bin/python3'
Dec 01 09:15:19 compute-0 sudo[96645]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ls
Dec 01 09:15:19 compute-0 sudo[96645]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:19 compute-0 sudo[96687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:15:19 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.2 deep-scrub starts
Dec 01 09:15:19 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.2 deep-scrub ok
Dec 01 09:15:19 compute-0 python3[96692]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z   --volume /tmp/ceph_mds.yml:/home/ceph_spec.yaml:z   --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   orch apply --in-file /home/ceph_spec.yaml _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:15:19 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 01 09:15:19 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 01 09:15:19 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
Dec 01 09:15:19 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
Dec 01 09:15:19 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
Dec 01 09:15:19 compute-0 ceph-mon[75031]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Dec 01 09:15:19 compute-0 ceph-mon[75031]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Dec 01 09:15:19 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Dec 01 09:15:19 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Dec 01 09:15:19 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]': finished
Dec 01 09:15:19 compute-0 ceph-mon[75031]: osdmap e43: 3 total, 3 up, 3 in
Dec 01 09:15:19 compute-0 ceph-mon[75031]: fsmap cephfs:0
Dec 01 09:15:19 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:19 compute-0 podman[96702]: 2025-12-01 09:15:19.777158293 +0000 UTC m=+0.041515529 container create f91447b735a4aaee2ab375a7a83bcb869fbef33baad6e1371bfd6438b6196e64 (image=quay.io/ceph/ceph:v18, name=epic_kepler, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 01 09:15:19 compute-0 systemd[1]: Started libpod-conmon-f91447b735a4aaee2ab375a7a83bcb869fbef33baad6e1371bfd6438b6196e64.scope.
Dec 01 09:15:19 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/32cca5f0e3b422a5e7501e00004c7f634a5cb05b3e828aa9e868943f0855073a/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/32cca5f0e3b422a5e7501e00004c7f634a5cb05b3e828aa9e868943f0855073a/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/32cca5f0e3b422a5e7501e00004c7f634a5cb05b3e828aa9e868943f0855073a/merged/home/ceph_spec.yaml supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:19 compute-0 podman[96702]: 2025-12-01 09:15:19.854034042 +0000 UTC m=+0.118391298 container init f91447b735a4aaee2ab375a7a83bcb869fbef33baad6e1371bfd6438b6196e64 (image=quay.io/ceph/ceph:v18, name=epic_kepler, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef)
Dec 01 09:15:19 compute-0 podman[96702]: 2025-12-01 09:15:19.760443872 +0000 UTC m=+0.024801148 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:15:19 compute-0 podman[96702]: 2025-12-01 09:15:19.863882114 +0000 UTC m=+0.128239360 container start f91447b735a4aaee2ab375a7a83bcb869fbef33baad6e1371bfd6438b6196e64 (image=quay.io/ceph/ceph:v18, name=epic_kepler, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec 01 09:15:19 compute-0 podman[96702]: 2025-12-01 09:15:19.868509831 +0000 UTC m=+0.132867097 container attach f91447b735a4aaee2ab375a7a83bcb869fbef33baad6e1371bfd6438b6196e64 (image=quay.io/ceph/ceph:v18, name=epic_kepler, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 01 09:15:20 compute-0 podman[96784]: 2025-12-01 09:15:20.145957895 +0000 UTC m=+0.060561462 container exec a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_REF=reef)
Dec 01 09:15:20 compute-0 podman[96784]: 2025-12-01 09:15:20.264123285 +0000 UTC m=+0.178726892 container exec_died a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Dec 01 09:15:20 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e43 do_prune osdmap full prune enabled
Dec 01 09:15:20 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e44 e44: 3 total, 3 up, 3 in
Dec 01 09:15:20 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e44: 3 total, 3 up, 3 in
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.1a( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.14( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.17( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.16( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.11( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.10( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.13( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.12( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.d( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.c( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.f( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.e( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.2( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.3( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.1( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.1b( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.6( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.7( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.8( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.19( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.4( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.9( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.b( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.15( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.5( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.a( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.1e( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.1f( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.1c( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.1d( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.18( empty local-lis/les=29/30 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.14( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.1a( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.16( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.17( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.11( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.10( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.12( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.0( empty local-lis/les=43/44 n=0 ec=29/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.3( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.1b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.7( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.19( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.9( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.5( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.15( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.a( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:20 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 44 pg[6.18( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=29/29 les/c/f=30/30/0 sis=43) [0] r=0 lpr=43 pi=[29,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:20 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14244 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:15:20 compute-0 ceph-mgr[75324]: [cephadm INFO root] Saving service mds.cephfs spec with placement compute-0
Dec 01 09:15:20 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Saving service mds.cephfs spec with placement compute-0
Dec 01 09:15:20 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mds.cephfs}] v 0) v1
Dec 01 09:15:20 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.3 scrub starts
Dec 01 09:15:20 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.3 scrub ok
Dec 01 09:15:20 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:20 compute-0 epic_kepler[96734]: Scheduled mds.cephfs update...
Dec 01 09:15:20 compute-0 systemd[1]: libpod-f91447b735a4aaee2ab375a7a83bcb869fbef33baad6e1371bfd6438b6196e64.scope: Deactivated successfully.
Dec 01 09:15:20 compute-0 podman[96702]: 2025-12-01 09:15:20.760243558 +0000 UTC m=+1.024600804 container died f91447b735a4aaee2ab375a7a83bcb869fbef33baad6e1371bfd6438b6196e64 (image=quay.io/ceph/ceph:v18, name=epic_kepler, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3)
Dec 01 09:15:20 compute-0 sudo[96645]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:20 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:15:20 compute-0 ceph-mon[75031]: pgmap v96: 131 pgs: 93 unknown, 38 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:20 compute-0 ceph-mon[75031]: from='client.14242 -' entity='client.admin' cmd=[{"prefix": "fs volume create", "name": "cephfs", "placement": "compute-0 ", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:15:20 compute-0 ceph-mon[75031]: Saving service mds.cephfs spec with placement compute-0
Dec 01 09:15:20 compute-0 ceph-mon[75031]: 3.2 deep-scrub starts
Dec 01 09:15:20 compute-0 ceph-mon[75031]: 3.2 deep-scrub ok
Dec 01 09:15:20 compute-0 ceph-mon[75031]: osdmap e44: 3 total, 3 up, 3 in
Dec 01 09:15:20 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:20 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:20 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:15:20 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:20 compute-0 systemd[1]: var-lib-containers-storage-overlay-32cca5f0e3b422a5e7501e00004c7f634a5cb05b3e828aa9e868943f0855073a-merged.mount: Deactivated successfully.
Dec 01 09:15:20 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:15:20 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:15:20 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec 01 09:15:20 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:15:20 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec 01 09:15:20 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:20 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev cb41a296-48fd-4efe-b232-6bd37d0d4845 does not exist
Dec 01 09:15:20 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 789f6ee6-c2dc-4926-8b19-6aa0ec081238 does not exist
Dec 01 09:15:20 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev dd9c7641-4a36-4186-88e3-e2c28d7194a7 does not exist
Dec 01 09:15:20 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec 01 09:15:20 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:15:20 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec 01 09:15:20 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:15:20 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:15:20 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:15:20 compute-0 podman[96702]: 2025-12-01 09:15:20.960612736 +0000 UTC m=+1.224969992 container remove f91447b735a4aaee2ab375a7a83bcb869fbef33baad6e1371bfd6438b6196e64 (image=quay.io/ceph/ceph:v18, name=epic_kepler, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec 01 09:15:20 compute-0 systemd[1]: libpod-conmon-f91447b735a4aaee2ab375a7a83bcb869fbef33baad6e1371bfd6438b6196e64.scope: Deactivated successfully.
Dec 01 09:15:21 compute-0 sudo[96687]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:21 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v99: 193 pgs: 1 peering, 62 unknown, 130 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:21 compute-0 sudo[96941]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:15:21 compute-0 sudo[96941]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:21 compute-0 sudo[96941]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:21 compute-0 sudo[96966]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:15:21 compute-0 sudo[96966]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:21 compute-0 sudo[96966]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:21 compute-0 sudo[96991]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:15:21 compute-0 sudo[96991]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:21 compute-0 sudo[96991]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:21 compute-0 sudo[97016]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --yes --no-systemd
Dec 01 09:15:21 compute-0 sudo[97016]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 43 pg[7.0( empty local-lis/les=31/32 n=0 ec=31/31 lis/c=31/31 les/c/f=32/32/0 sis=43 pruub=13.941397667s) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active pruub 79.873344421s@ mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [1], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.0( empty local-lis/les=31/32 n=0 ec=31/31 lis/c=31/31 les/c/f=32/32/0 sis=43 pruub=13.941397667s) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown pruub 79.873344421s@ mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.11( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.12( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.17( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.18( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.19( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.1a( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.4( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.3( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.1( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.2( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.5( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.6( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.9( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.a( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.7( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.8( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.d( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.e( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.b( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.c( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.f( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.10( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.13( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.14( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.15( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.16( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.1b( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.1f( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.1c( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.1d( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 44 pg[7.1e( empty local-lis/les=31/32 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:21 compute-0 sudo[97140]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcnahdpbrfzxcayxoxsvsjnwsstnxfto ; /usr/bin/python3'
Dec 01 09:15:21 compute-0 sudo[97140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:15:21 compute-0 podman[97160]: 2025-12-01 09:15:21.646910533 +0000 UTC m=+0.073455512 container create 0ff11b873b6289e31483cc02c8a40d0db19757b85a533e4ddfbb14fd5e63ac50 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=focused_mcclintock, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:15:21 compute-0 python3[97147]: ansible-ansible.legacy.stat Invoked with path=/etc/ceph/ceph.client.openstack.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 09:15:21 compute-0 sudo[97140]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:21 compute-0 systemd[1]: Started libpod-conmon-0ff11b873b6289e31483cc02c8a40d0db19757b85a533e4ddfbb14fd5e63ac50.scope.
Dec 01 09:15:21 compute-0 podman[97160]: 2025-12-01 09:15:21.61751335 +0000 UTC m=+0.044058379 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:15:21 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:21 compute-0 podman[97160]: 2025-12-01 09:15:21.757596935 +0000 UTC m=+0.184141884 container init 0ff11b873b6289e31483cc02c8a40d0db19757b85a533e4ddfbb14fd5e63ac50 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=focused_mcclintock, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0)
Dec 01 09:15:21 compute-0 podman[97160]: 2025-12-01 09:15:21.768933634 +0000 UTC m=+0.195478573 container start 0ff11b873b6289e31483cc02c8a40d0db19757b85a533e4ddfbb14fd5e63ac50 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=focused_mcclintock, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:15:21 compute-0 podman[97160]: 2025-12-01 09:15:21.772973332 +0000 UTC m=+0.199518381 container attach 0ff11b873b6289e31483cc02c8a40d0db19757b85a533e4ddfbb14fd5e63ac50 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=focused_mcclintock, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec 01 09:15:21 compute-0 focused_mcclintock[97181]: 167 167
Dec 01 09:15:21 compute-0 systemd[1]: libpod-0ff11b873b6289e31483cc02c8a40d0db19757b85a533e4ddfbb14fd5e63ac50.scope: Deactivated successfully.
Dec 01 09:15:21 compute-0 podman[97160]: 2025-12-01 09:15:21.774978536 +0000 UTC m=+0.201523475 container died 0ff11b873b6289e31483cc02c8a40d0db19757b85a533e4ddfbb14fd5e63ac50 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=focused_mcclintock, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS)
Dec 01 09:15:21 compute-0 systemd[1]: var-lib-containers-storage-overlay-29fcaa556f8c12a0506b8bbb81bc7f54c5f9e526643734d5c91d358af0b975b4-merged.mount: Deactivated successfully.
Dec 01 09:15:21 compute-0 podman[97160]: 2025-12-01 09:15:21.81543008 +0000 UTC m=+0.241975029 container remove 0ff11b873b6289e31483cc02c8a40d0db19757b85a533e4ddfbb14fd5e63ac50 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=focused_mcclintock, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:15:21 compute-0 systemd[1]: libpod-conmon-0ff11b873b6289e31483cc02c8a40d0db19757b85a533e4ddfbb14fd5e63ac50.scope: Deactivated successfully.
Dec 01 09:15:21 compute-0 sudo[97266]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emkktthnljaqwgeyzxtejrlxndpgfmho ; /usr/bin/python3'
Dec 01 09:15:21 compute-0 sudo[97266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:15:21 compute-0 ceph-mon[75031]: from='client.14244 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:15:21 compute-0 ceph-mon[75031]: Saving service mds.cephfs spec with placement compute-0
Dec 01 09:15:21 compute-0 ceph-mon[75031]: 3.3 scrub starts
Dec 01 09:15:21 compute-0 ceph-mon[75031]: 3.3 scrub ok
Dec 01 09:15:21 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:21 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:21 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:15:21 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:15:21 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:21 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:15:21 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:15:21 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:15:21 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e44 do_prune osdmap full prune enabled
Dec 01 09:15:21 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e45 e45: 3 total, 3 up, 3 in
Dec 01 09:15:21 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e45: 3 total, 3 up, 3 in
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.1e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.12( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.10( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.1d( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.17( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.16( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.14( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.7( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.d( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.19( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.0( empty local-lis/les=43/45 n=0 ec=31/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:21 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 45 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=31/31 les/c/f=32/32/0 sis=43) [1] r=0 lpr=43 pi=[31,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:21 compute-0 podman[97274]: 2025-12-01 09:15:21.991115495 +0000 UTC m=+0.057898119 container create 68c63b63aa80486b1290889df9bdd17c2583de857cb743134ddb3bc17231059d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_yalow, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.license=GPLv2)
Dec 01 09:15:22 compute-0 systemd[1]: Started libpod-conmon-68c63b63aa80486b1290889df9bdd17c2583de857cb743134ddb3bc17231059d.scope.
Dec 01 09:15:22 compute-0 podman[97274]: 2025-12-01 09:15:21.957110815 +0000 UTC m=+0.023893489 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:15:22 compute-0 python3[97268]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764580521.346623-36549-116230047858529/source dest=/etc/ceph/ceph.client.openstack.keyring mode=0644 force=True owner=167 group=167 follow=False _original_basename=ceph_key.j2 checksum=30c595aa84bea916cfc9cc906a8788f27659122a backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:15:22 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/75e4d34a1e513df5001d46cffe4aa4a368565cafde0d38f59cc8b838a67b07f7/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/75e4d34a1e513df5001d46cffe4aa4a368565cafde0d38f59cc8b838a67b07f7/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/75e4d34a1e513df5001d46cffe4aa4a368565cafde0d38f59cc8b838a67b07f7/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/75e4d34a1e513df5001d46cffe4aa4a368565cafde0d38f59cc8b838a67b07f7/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/75e4d34a1e513df5001d46cffe4aa4a368565cafde0d38f59cc8b838a67b07f7/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:22 compute-0 podman[97274]: 2025-12-01 09:15:22.087777632 +0000 UTC m=+0.154560286 container init 68c63b63aa80486b1290889df9bdd17c2583de857cb743134ddb3bc17231059d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_yalow, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True)
Dec 01 09:15:22 compute-0 sudo[97266]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:22 compute-0 podman[97274]: 2025-12-01 09:15:22.097697977 +0000 UTC m=+0.164480601 container start 68c63b63aa80486b1290889df9bdd17c2583de857cb743134ddb3bc17231059d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_yalow, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:15:22 compute-0 podman[97274]: 2025-12-01 09:15:22.101526158 +0000 UTC m=+0.168308802 container attach 68c63b63aa80486b1290889df9bdd17c2583de857cb743134ddb3bc17231059d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_yalow, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS)
Dec 01 09:15:22 compute-0 sudo[97342]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uezvyrupurbccxqtshpuyzgophmppvcu ; /usr/bin/python3'
Dec 01 09:15:22 compute-0 sudo[97342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:15:22 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.1 scrub starts
Dec 01 09:15:22 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.1 scrub ok
Dec 01 09:15:22 compute-0 python3[97344]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   auth import -i /etc/ceph/ceph.client.openstack.keyring _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:15:22 compute-0 podman[97345]: 2025-12-01 09:15:22.61843059 +0000 UTC m=+0.026606595 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:15:22 compute-0 podman[97345]: 2025-12-01 09:15:22.821252176 +0000 UTC m=+0.229428181 container create dcdf11a0e654da104f349893d683c519912358b4a1098596851fb699e05b012b (image=quay.io/ceph/ceph:v18, name=focused_bell, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_REF=reef, OSD_FLAVOR=default)
Dec 01 09:15:22 compute-0 systemd[1]: Started libpod-conmon-dcdf11a0e654da104f349893d683c519912358b4a1098596851fb699e05b012b.scope.
Dec 01 09:15:22 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d124a8e1798da8aca34e5353f3f6e9595804cce06e967f4a5109e74aaa046729/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d124a8e1798da8aca34e5353f3f6e9595804cce06e967f4a5109e74aaa046729/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:22 compute-0 podman[97345]: 2025-12-01 09:15:22.940925914 +0000 UTC m=+0.349101939 container init dcdf11a0e654da104f349893d683c519912358b4a1098596851fb699e05b012b (image=quay.io/ceph/ceph:v18, name=focused_bell, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec 01 09:15:22 compute-0 podman[97345]: 2025-12-01 09:15:22.948772073 +0000 UTC m=+0.356948078 container start dcdf11a0e654da104f349893d683c519912358b4a1098596851fb699e05b012b (image=quay.io/ceph/ceph:v18, name=focused_bell, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2)
Dec 01 09:15:22 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e45 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:15:22 compute-0 podman[97345]: 2025-12-01 09:15:22.952498741 +0000 UTC m=+0.360674856 container attach dcdf11a0e654da104f349893d683c519912358b4a1098596851fb699e05b012b (image=quay.io/ceph/ceph:v18, name=focused_bell, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:15:22 compute-0 ceph-mon[75031]: pgmap v99: 193 pgs: 1 peering, 62 unknown, 130 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:22 compute-0 ceph-mon[75031]: osdmap e45: 3 total, 3 up, 3 in
Dec 01 09:15:22 compute-0 ceph-mon[75031]: 4.1 scrub starts
Dec 01 09:15:22 compute-0 ceph-mon[75031]: 4.1 scrub ok
Dec 01 09:15:23 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v101: 193 pgs: 1 peering, 31 unknown, 161 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:23 compute-0 ceph-mgr[75324]: [progress INFO root] Writing back 9 completed events
Dec 01 09:15:23 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) v1
Dec 01 09:15:23 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:23 compute-0 determined_yalow[97290]: --> passed data devices: 0 physical, 3 LVM
Dec 01 09:15:23 compute-0 determined_yalow[97290]: --> relative data size: 1.0
Dec 01 09:15:23 compute-0 determined_yalow[97290]: --> All data devices are unavailable
Dec 01 09:15:23 compute-0 systemd[1]: libpod-68c63b63aa80486b1290889df9bdd17c2583de857cb743134ddb3bc17231059d.scope: Deactivated successfully.
Dec 01 09:15:23 compute-0 podman[97274]: 2025-12-01 09:15:23.242128912 +0000 UTC m=+1.308911536 container died 68c63b63aa80486b1290889df9bdd17c2583de857cb743134ddb3bc17231059d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_yalow, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Dec 01 09:15:23 compute-0 systemd[1]: libpod-68c63b63aa80486b1290889df9bdd17c2583de857cb743134ddb3bc17231059d.scope: Consumed 1.053s CPU time.
Dec 01 09:15:23 compute-0 systemd[1]: var-lib-containers-storage-overlay-75e4d34a1e513df5001d46cffe4aa4a368565cafde0d38f59cc8b838a67b07f7-merged.mount: Deactivated successfully.
Dec 01 09:15:23 compute-0 podman[97274]: 2025-12-01 09:15:23.341218456 +0000 UTC m=+1.408001080 container remove 68c63b63aa80486b1290889df9bdd17c2583de857cb743134ddb3bc17231059d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_yalow, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Dec 01 09:15:23 compute-0 systemd[1]: libpod-conmon-68c63b63aa80486b1290889df9bdd17c2583de857cb743134ddb3bc17231059d.scope: Deactivated successfully.
Dec 01 09:15:23 compute-0 sudo[97016]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:23 compute-0 sudo[97421]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:15:23 compute-0 sudo[97421]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:23 compute-0 sudo[97421]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:23 compute-0 sudo[97446]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:15:23 compute-0 sudo[97446]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:23 compute-0 sudo[97446]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:23 compute-0 sudo[97471]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:15:23 compute-0 sudo[97471]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:23 compute-0 sudo[97471]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:23 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth import"} v 0) v1
Dec 01 09:15:23 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/2988375873' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Dec 01 09:15:23 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/2988375873' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Dec 01 09:15:23 compute-0 sudo[97496]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- lvm list --format json
Dec 01 09:15:23 compute-0 sudo[97496]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:23 compute-0 systemd[1]: libpod-dcdf11a0e654da104f349893d683c519912358b4a1098596851fb699e05b012b.scope: Deactivated successfully.
Dec 01 09:15:23 compute-0 podman[97345]: 2025-12-01 09:15:23.623243525 +0000 UTC m=+1.031419540 container died dcdf11a0e654da104f349893d683c519912358b4a1098596851fb699e05b012b (image=quay.io/ceph/ceph:v18, name=focused_bell, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:15:23 compute-0 systemd[1]: var-lib-containers-storage-overlay-d124a8e1798da8aca34e5353f3f6e9595804cce06e967f4a5109e74aaa046729-merged.mount: Deactivated successfully.
Dec 01 09:15:23 compute-0 podman[97345]: 2025-12-01 09:15:23.671143075 +0000 UTC m=+1.079319080 container remove dcdf11a0e654da104f349893d683c519912358b4a1098596851fb699e05b012b (image=quay.io/ceph/ceph:v18, name=focused_bell, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:15:23 compute-0 systemd[1]: libpod-conmon-dcdf11a0e654da104f349893d683c519912358b4a1098596851fb699e05b012b.scope: Deactivated successfully.
Dec 01 09:15:23 compute-0 sudo[97342]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:23 compute-0 podman[97576]: 2025-12-01 09:15:23.954732934 +0000 UTC m=+0.038722020 container create 6b559be8aac9e26fd01ae011d9edf1039fe092d734246f272a4f7b162432da43 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_hermann, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 01 09:15:23 compute-0 systemd[1]: Started libpod-conmon-6b559be8aac9e26fd01ae011d9edf1039fe092d734246f272a4f7b162432da43.scope.
Dec 01 09:15:24 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:24 compute-0 podman[97576]: 2025-12-01 09:15:24.026883633 +0000 UTC m=+0.110872719 container init 6b559be8aac9e26fd01ae011d9edf1039fe092d734246f272a4f7b162432da43 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_hermann, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec 01 09:15:24 compute-0 podman[97576]: 2025-12-01 09:15:24.032645906 +0000 UTC m=+0.116634992 container start 6b559be8aac9e26fd01ae011d9edf1039fe092d734246f272a4f7b162432da43 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_hermann, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:15:24 compute-0 podman[97576]: 2025-12-01 09:15:23.937938591 +0000 UTC m=+0.021927697 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:15:24 compute-0 podman[97576]: 2025-12-01 09:15:24.035872699 +0000 UTC m=+0.119861815 container attach 6b559be8aac9e26fd01ae011d9edf1039fe092d734246f272a4f7b162432da43 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_hermann, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:15:24 compute-0 elegant_hermann[97592]: 167 167
Dec 01 09:15:24 compute-0 systemd[1]: libpod-6b559be8aac9e26fd01ae011d9edf1039fe092d734246f272a4f7b162432da43.scope: Deactivated successfully.
Dec 01 09:15:24 compute-0 podman[97576]: 2025-12-01 09:15:24.037279813 +0000 UTC m=+0.121268899 container died 6b559be8aac9e26fd01ae011d9edf1039fe092d734246f272a4f7b162432da43 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_hermann, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.schema-version=1.0)
Dec 01 09:15:24 compute-0 systemd[1]: var-lib-containers-storage-overlay-a59471fc1f4b9b9e7a4c6c4c40b4394a030bf9f5ed40c7d8ed24d26263abcf46-merged.mount: Deactivated successfully.
Dec 01 09:15:24 compute-0 podman[97576]: 2025-12-01 09:15:24.073947837 +0000 UTC m=+0.157936923 container remove 6b559be8aac9e26fd01ae011d9edf1039fe092d734246f272a4f7b162432da43 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_hermann, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Dec 01 09:15:24 compute-0 systemd[1]: libpod-conmon-6b559be8aac9e26fd01ae011d9edf1039fe092d734246f272a4f7b162432da43.scope: Deactivated successfully.
Dec 01 09:15:24 compute-0 sudo[97639]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eardvzdyffjadafiwuglfmnqotrpbxjf ; /usr/bin/python3'
Dec 01 09:15:24 compute-0 sudo[97639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:15:24 compute-0 ceph-mon[75031]: pgmap v101: 193 pgs: 1 peering, 31 unknown, 161 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:24 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:24 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2988375873' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Dec 01 09:15:24 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2988375873' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Dec 01 09:15:24 compute-0 podman[97638]: 2025-12-01 09:15:24.218891906 +0000 UTC m=+0.044038098 container create 6195c5dadb0edf002e2439404b4dd4bba033d1c1dc8409ffe80ca9602bacb599 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_knuth, ceph=True, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:15:24 compute-0 systemd[1]: Started libpod-conmon-6195c5dadb0edf002e2439404b4dd4bba033d1c1dc8409ffe80ca9602bacb599.scope.
Dec 01 09:15:24 compute-0 podman[97638]: 2025-12-01 09:15:24.203463286 +0000 UTC m=+0.028609498 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:15:24 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35e4821a2616c8bc260468275709e7d0c2311c8703514c6d389c30af2f0f55a6/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35e4821a2616c8bc260468275709e7d0c2311c8703514c6d389c30af2f0f55a6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35e4821a2616c8bc260468275709e7d0c2311c8703514c6d389c30af2f0f55a6/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35e4821a2616c8bc260468275709e7d0c2311c8703514c6d389c30af2f0f55a6/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:24 compute-0 python3[97647]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   status --format json | jq .monmap.num_mons _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:15:24 compute-0 podman[97638]: 2025-12-01 09:15:24.340695381 +0000 UTC m=+0.165841603 container init 6195c5dadb0edf002e2439404b4dd4bba033d1c1dc8409ffe80ca9602bacb599 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_knuth, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef)
Dec 01 09:15:24 compute-0 podman[97638]: 2025-12-01 09:15:24.351115652 +0000 UTC m=+0.176261844 container start 6195c5dadb0edf002e2439404b4dd4bba033d1c1dc8409ffe80ca9602bacb599 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_knuth, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:15:24 compute-0 podman[97638]: 2025-12-01 09:15:24.372006345 +0000 UTC m=+0.197152557 container attach 6195c5dadb0edf002e2439404b4dd4bba033d1c1dc8409ffe80ca9602bacb599 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_knuth, ceph=True, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Dec 01 09:15:24 compute-0 podman[97662]: 2025-12-01 09:15:24.39833522 +0000 UTC m=+0.046425114 container create 84c74c37079a79421a02a582a48bc3dd03c5ef0e1af1d63ce648dcd5fe8a1f0a (image=quay.io/ceph/ceph:v18, name=intelligent_kepler, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3)
Dec 01 09:15:24 compute-0 systemd[1]: Started libpod-conmon-84c74c37079a79421a02a582a48bc3dd03c5ef0e1af1d63ce648dcd5fe8a1f0a.scope.
Dec 01 09:15:24 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55f1a9227a2d40fdbb98e5afc7638945fda74a55235b768026b37860b935e029/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55f1a9227a2d40fdbb98e5afc7638945fda74a55235b768026b37860b935e029/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:24 compute-0 podman[97662]: 2025-12-01 09:15:24.471680357 +0000 UTC m=+0.119770281 container init 84c74c37079a79421a02a582a48bc3dd03c5ef0e1af1d63ce648dcd5fe8a1f0a (image=quay.io/ceph/ceph:v18, name=intelligent_kepler, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:15:24 compute-0 podman[97662]: 2025-12-01 09:15:24.379195183 +0000 UTC m=+0.027285097 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:15:24 compute-0 podman[97662]: 2025-12-01 09:15:24.477967597 +0000 UTC m=+0.126057491 container start 84c74c37079a79421a02a582a48bc3dd03c5ef0e1af1d63ce648dcd5fe8a1f0a (image=quay.io/ceph/ceph:v18, name=intelligent_kepler, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 01 09:15:24 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.2 deep-scrub starts
Dec 01 09:15:24 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.2 deep-scrub ok
Dec 01 09:15:24 compute-0 podman[97662]: 2025-12-01 09:15:24.65988782 +0000 UTC m=+0.307977734 container attach 84c74c37079a79421a02a582a48bc3dd03c5ef0e1af1d63ce648dcd5fe8a1f0a (image=quay.io/ceph/ceph:v18, name=intelligent_kepler, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:15:25 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v102: 193 pgs: 1 peering, 31 unknown, 161 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:25 compute-0 interesting_knuth[97658]: {
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:     "0": [
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:         {
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             "devices": [
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "/dev/loop3"
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             ],
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             "lv_name": "ceph_lv0",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             "lv_size": "21470642176",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             "name": "ceph_lv0",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             "tags": {
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.cluster_name": "ceph",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.crush_device_class": "",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.encrypted": "0",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.osd_id": "0",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.type": "block",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.vdo": "0"
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             },
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             "type": "block",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             "vg_name": "ceph_vg0"
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:         }
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:     ],
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:     "1": [
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:         {
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             "devices": [
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "/dev/loop4"
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             ],
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             "lv_name": "ceph_lv1",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             "lv_size": "21470642176",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             "name": "ceph_lv1",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             "tags": {
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.cluster_name": "ceph",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.crush_device_class": "",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.encrypted": "0",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.osd_id": "1",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.type": "block",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.vdo": "0"
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             },
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             "type": "block",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             "vg_name": "ceph_vg1"
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:         }
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:     ],
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:     "2": [
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:         {
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             "devices": [
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "/dev/loop5"
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             ],
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             "lv_name": "ceph_lv2",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             "lv_size": "21470642176",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             "name": "ceph_lv2",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             "tags": {
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.cluster_name": "ceph",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.crush_device_class": "",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.encrypted": "0",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.osd_id": "2",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.type": "block",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:                 "ceph.vdo": "0"
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             },
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             "type": "block",
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:             "vg_name": "ceph_vg2"
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:         }
Dec 01 09:15:25 compute-0 interesting_knuth[97658]:     ]
Dec 01 09:15:25 compute-0 interesting_knuth[97658]: }
Dec 01 09:15:25 compute-0 systemd[1]: libpod-6195c5dadb0edf002e2439404b4dd4bba033d1c1dc8409ffe80ca9602bacb599.scope: Deactivated successfully.
Dec 01 09:15:25 compute-0 podman[97638]: 2025-12-01 09:15:25.195685091 +0000 UTC m=+1.020831283 container died 6195c5dadb0edf002e2439404b4dd4bba033d1c1dc8409ffe80ca9602bacb599 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_knuth, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:15:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json"} v 0) v1
Dec 01 09:15:25 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/480516295' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Dec 01 09:15:25 compute-0 intelligent_kepler[97680]: 
Dec 01 09:15:25 compute-0 intelligent_kepler[97680]: {"fsid":"5620a9fb-e540-5250-a0e8-7aaad5347e3b","health":{"status":"HEALTH_ERR","checks":{"MDS_ALL_DOWN":{"severity":"HEALTH_ERR","summary":{"message":"1 filesystem is offline","count":1},"muted":false},"MDS_UP_LESS_THAN_MAX":{"severity":"HEALTH_WARN","summary":{"message":"1 filesystem is online with fewer MDS than max_mds","count":1},"muted":false}},"mutes":[]},"election_epoch":5,"quorum":[0],"quorum_names":["compute-0"],"quorum_age":182,"monmap":{"epoch":1,"min_mon_release_name":"reef","num_mons":1},"osdmap":{"epoch":45,"num_osds":3,"num_up_osds":3,"osd_up_since":1764580474,"num_in_osds":3,"osd_in_since":1764580437,"num_remapped_pgs":0},"pgmap":{"pgs_by_state":[{"state_name":"active+clean","count":161},{"state_name":"unknown","count":31},{"state_name":"peering","count":1}],"num_pgs":193,"num_pools":7,"num_objects":2,"data_bytes":459280,"bytes_used":84180992,"bytes_avail":64327745536,"bytes_total":64411926528,"unknown_pgs_ratio":0.1606217622756958,"inactive_pgs_ratio":0.005181347019970417},"fsmap":{"epoch":2,"id":1,"up":0,"in":0,"max":1,"by_rank":[],"up:standby":0},"mgrmap":{"available":true,"num_standbys":0,"modules":["cephadm","iostat","nfs","restful"],"services":{}},"servicemap":{"epoch":3,"modified":"2025-12-01T09:15:21.032921+0000","services":{"osd":{"daemons":{"summary":"","1":{"start_epoch":0,"start_stamp":"0.000000","gid":0,"addr":"(unrecognized address family 0)/0","metadata":{},"task_status":{}},"2":{"start_epoch":0,"start_stamp":"0.000000","gid":0,"addr":"(unrecognized address family 0)/0","metadata":{},"task_status":{}}}}}},"progress_events":{"6a1eadee-2fd7-4097-9f7f-4e2c6af1e403":{"message":"Global Recovery Event (0s)\n      [............................] ","progress":0,"add_to_ceph_s":true}}}
Dec 01 09:15:25 compute-0 ceph-mon[75031]: 4.2 deep-scrub starts
Dec 01 09:15:25 compute-0 ceph-mon[75031]: 4.2 deep-scrub ok
Dec 01 09:15:25 compute-0 systemd[1]: var-lib-containers-storage-overlay-35e4821a2616c8bc260468275709e7d0c2311c8703514c6d389c30af2f0f55a6-merged.mount: Deactivated successfully.
Dec 01 09:15:25 compute-0 systemd[1]: libpod-84c74c37079a79421a02a582a48bc3dd03c5ef0e1af1d63ce648dcd5fe8a1f0a.scope: Deactivated successfully.
Dec 01 09:15:25 compute-0 podman[97662]: 2025-12-01 09:15:25.261158499 +0000 UTC m=+0.909248393 container died 84c74c37079a79421a02a582a48bc3dd03c5ef0e1af1d63ce648dcd5fe8a1f0a (image=quay.io/ceph/ceph:v18, name=intelligent_kepler, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:15:25 compute-0 podman[97638]: 2025-12-01 09:15:25.357463104 +0000 UTC m=+1.182609336 container remove 6195c5dadb0edf002e2439404b4dd4bba033d1c1dc8409ffe80ca9602bacb599 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_knuth, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef)
Dec 01 09:15:25 compute-0 systemd[1]: libpod-conmon-6195c5dadb0edf002e2439404b4dd4bba033d1c1dc8409ffe80ca9602bacb599.scope: Deactivated successfully.
Dec 01 09:15:25 compute-0 systemd[1]: var-lib-containers-storage-overlay-55f1a9227a2d40fdbb98e5afc7638945fda74a55235b768026b37860b935e029-merged.mount: Deactivated successfully.
Dec 01 09:15:25 compute-0 sudo[97496]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:25 compute-0 podman[97662]: 2025-12-01 09:15:25.420401061 +0000 UTC m=+1.068490955 container remove 84c74c37079a79421a02a582a48bc3dd03c5ef0e1af1d63ce648dcd5fe8a1f0a (image=quay.io/ceph/ceph:v18, name=intelligent_kepler, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3)
Dec 01 09:15:25 compute-0 systemd[1]: libpod-conmon-84c74c37079a79421a02a582a48bc3dd03c5ef0e1af1d63ce648dcd5fe8a1f0a.scope: Deactivated successfully.
Dec 01 09:15:25 compute-0 sudo[97639]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:25 compute-0 sudo[97734]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:15:25 compute-0 sudo[97734]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:25 compute-0 sudo[97734]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:25 compute-0 sudo[97759]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:15:25 compute-0 sudo[97759]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:25 compute-0 sudo[97759]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:25 compute-0 sudo[97814]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqnonwuitdoqhcvklpkbgdovbopdvzvr ; /usr/bin/python3'
Dec 01 09:15:25 compute-0 sudo[97814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:15:25 compute-0 sudo[97801]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:15:25 compute-0 sudo[97801]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:25 compute-0 sudo[97801]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:25 compute-0 sudo[97835]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- raw list --format json
Dec 01 09:15:25 compute-0 sudo[97835]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:25 compute-0 python3[97832]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   mon dump --format json _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:15:25 compute-0 podman[97860]: 2025-12-01 09:15:25.80989538 +0000 UTC m=+0.046677802 container create 3ff1d43e221027082e6433868ae81bafda9e747ce212972418b7e109bd477109 (image=quay.io/ceph/ceph:v18, name=relaxed_bohr, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:15:25 compute-0 systemd[1]: Started libpod-conmon-3ff1d43e221027082e6433868ae81bafda9e747ce212972418b7e109bd477109.scope.
Dec 01 09:15:25 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86c0ac1fae0c8f4087fdf12a1783d13fdbbedbc5ec3da1c971df73ea3cec59d3/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86c0ac1fae0c8f4087fdf12a1783d13fdbbedbc5ec3da1c971df73ea3cec59d3/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:25 compute-0 podman[97860]: 2025-12-01 09:15:25.791359672 +0000 UTC m=+0.028142134 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:15:25 compute-0 podman[97860]: 2025-12-01 09:15:25.888134723 +0000 UTC m=+0.124917165 container init 3ff1d43e221027082e6433868ae81bafda9e747ce212972418b7e109bd477109 (image=quay.io/ceph/ceph:v18, name=relaxed_bohr, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:15:25 compute-0 podman[97860]: 2025-12-01 09:15:25.894847426 +0000 UTC m=+0.131629848 container start 3ff1d43e221027082e6433868ae81bafda9e747ce212972418b7e109bd477109 (image=quay.io/ceph/ceph:v18, name=relaxed_bohr, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:15:25 compute-0 podman[97860]: 2025-12-01 09:15:25.898465951 +0000 UTC m=+0.135248393 container attach 3ff1d43e221027082e6433868ae81bafda9e747ce212972418b7e109bd477109 (image=quay.io/ceph/ceph:v18, name=relaxed_bohr, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:15:26 compute-0 podman[97914]: 2025-12-01 09:15:26.016377872 +0000 UTC m=+0.039784863 container create d8343bb0a5c99496b7688ce59c43d0e6101979424b516568f178045e0006f2b9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=adoring_banach, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec 01 09:15:26 compute-0 systemd[1]: Started libpod-conmon-d8343bb0a5c99496b7688ce59c43d0e6101979424b516568f178045e0006f2b9.scope.
Dec 01 09:15:26 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:26 compute-0 podman[97914]: 2025-12-01 09:15:26.079429883 +0000 UTC m=+0.102836884 container init d8343bb0a5c99496b7688ce59c43d0e6101979424b516568f178045e0006f2b9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=adoring_banach, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.build-date=20250507)
Dec 01 09:15:26 compute-0 podman[97914]: 2025-12-01 09:15:26.086379714 +0000 UTC m=+0.109786705 container start d8343bb0a5c99496b7688ce59c43d0e6101979424b516568f178045e0006f2b9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=adoring_banach, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 01 09:15:26 compute-0 adoring_banach[97930]: 167 167
Dec 01 09:15:26 compute-0 systemd[1]: libpod-d8343bb0a5c99496b7688ce59c43d0e6101979424b516568f178045e0006f2b9.scope: Deactivated successfully.
Dec 01 09:15:26 compute-0 podman[97914]: 2025-12-01 09:15:26.091584179 +0000 UTC m=+0.114991170 container attach d8343bb0a5c99496b7688ce59c43d0e6101979424b516568f178045e0006f2b9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=adoring_banach, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Dec 01 09:15:26 compute-0 podman[97914]: 2025-12-01 09:15:26.091861558 +0000 UTC m=+0.115268549 container died d8343bb0a5c99496b7688ce59c43d0e6101979424b516568f178045e0006f2b9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=adoring_banach, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Dec 01 09:15:26 compute-0 podman[97914]: 2025-12-01 09:15:25.997862005 +0000 UTC m=+0.021269016 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:15:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-70e0c6b6e1514faddd4232a8473fd5eb321c4883828105f80e0a3276a193aef6-merged.mount: Deactivated successfully.
Dec 01 09:15:26 compute-0 podman[97914]: 2025-12-01 09:15:26.129085739 +0000 UTC m=+0.152492730 container remove d8343bb0a5c99496b7688ce59c43d0e6101979424b516568f178045e0006f2b9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=adoring_banach, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Dec 01 09:15:26 compute-0 systemd[1]: libpod-conmon-d8343bb0a5c99496b7688ce59c43d0e6101979424b516568f178045e0006f2b9.scope: Deactivated successfully.
Dec 01 09:15:26 compute-0 ceph-mon[75031]: pgmap v102: 193 pgs: 1 peering, 31 unknown, 161 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:26 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/480516295' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Dec 01 09:15:26 compute-0 podman[97973]: 2025-12-01 09:15:26.320658528 +0000 UTC m=+0.076105116 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:15:26 compute-0 podman[97973]: 2025-12-01 09:15:26.464261705 +0000 UTC m=+0.219708253 container create 8a8d2c61cf918dd7b1cc87b58c5c08103fd445c2907770358f1646c0cd957f45 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=fervent_boyd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, ceph=True, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 01 09:15:26 compute-0 systemd[1]: Started libpod-conmon-8a8d2c61cf918dd7b1cc87b58c5c08103fd445c2907770358f1646c0cd957f45.scope.
Dec 01 09:15:26 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/379c11194edf416e878dc71ace62775bd6357f827bf6a76c2c0eb0144f65a58f/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/379c11194edf416e878dc71ace62775bd6357f827bf6a76c2c0eb0144f65a58f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/379c11194edf416e878dc71ace62775bd6357f827bf6a76c2c0eb0144f65a58f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/379c11194edf416e878dc71ace62775bd6357f827bf6a76c2c0eb0144f65a58f/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:26 compute-0 podman[97973]: 2025-12-01 09:15:26.553587699 +0000 UTC m=+0.309034247 container init 8a8d2c61cf918dd7b1cc87b58c5c08103fd445c2907770358f1646c0cd957f45 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=fervent_boyd, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec 01 09:15:26 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Dec 01 09:15:26 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1127224077' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 09:15:26 compute-0 relaxed_bohr[97888]: 
Dec 01 09:15:26 compute-0 relaxed_bohr[97888]: {"epoch":1,"fsid":"5620a9fb-e540-5250-a0e8-7aaad5347e3b","modified":"2025-12-01T09:12:18.204879Z","created":"2025-12-01T09:12:18.204879Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"compute-0","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.122.100:3300","nonce":0},{"type":"v1","addr":"192.168.122.100:6789","nonce":0}]},"addr":"192.168.122.100:6789/0","public_addr":"192.168.122.100:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]}
Dec 01 09:15:26 compute-0 relaxed_bohr[97888]: dumped monmap epoch 1
Dec 01 09:15:26 compute-0 podman[97973]: 2025-12-01 09:15:26.562515702 +0000 UTC m=+0.317962250 container start 8a8d2c61cf918dd7b1cc87b58c5c08103fd445c2907770358f1646c0cd957f45 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=fervent_boyd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:15:26 compute-0 podman[97973]: 2025-12-01 09:15:26.566814329 +0000 UTC m=+0.322260897 container attach 8a8d2c61cf918dd7b1cc87b58c5c08103fd445c2907770358f1646c0cd957f45 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=fervent_boyd, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec 01 09:15:26 compute-0 systemd[1]: libpod-3ff1d43e221027082e6433868ae81bafda9e747ce212972418b7e109bd477109.scope: Deactivated successfully.
Dec 01 09:15:26 compute-0 podman[97860]: 2025-12-01 09:15:26.578139868 +0000 UTC m=+0.814922320 container died 3ff1d43e221027082e6433868ae81bafda9e747ce212972418b7e109bd477109 (image=quay.io/ceph/ceph:v18, name=relaxed_bohr, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:15:26 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.3 scrub starts
Dec 01 09:15:26 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.3 scrub ok
Dec 01 09:15:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-86c0ac1fae0c8f4087fdf12a1783d13fdbbedbc5ec3da1c971df73ea3cec59d3-merged.mount: Deactivated successfully.
Dec 01 09:15:26 compute-0 podman[97860]: 2025-12-01 09:15:26.626842034 +0000 UTC m=+0.863624456 container remove 3ff1d43e221027082e6433868ae81bafda9e747ce212972418b7e109bd477109 (image=quay.io/ceph/ceph:v18, name=relaxed_bohr, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:15:26 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.4 scrub starts
Dec 01 09:15:26 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.4 scrub ok
Dec 01 09:15:26 compute-0 systemd[1]: libpod-conmon-3ff1d43e221027082e6433868ae81bafda9e747ce212972418b7e109bd477109.scope: Deactivated successfully.
Dec 01 09:15:26 compute-0 sudo[97814]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:27 compute-0 sudo[98032]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cukvfbbnrzhoxfnahkiwauewyqsbyaur ; /usr/bin/python3'
Dec 01 09:15:27 compute-0 sudo[98032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:15:27 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v103: 193 pgs: 193 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:27 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"} v 0) v1
Dec 01 09:15:27 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 01 09:15:27 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"} v 0) v1
Dec 01 09:15:27 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 01 09:15:27 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"} v 0) v1
Dec 01 09:15:27 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 01 09:15:27 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"} v 0) v1
Dec 01 09:15:27 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 01 09:15:27 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"} v 0) v1
Dec 01 09:15:27 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 01 09:15:27 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"} v 0) v1
Dec 01 09:15:27 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 01 09:15:27 compute-0 python3[98034]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   auth get client.openstack _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:15:27 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e45 do_prune osdmap full prune enabled
Dec 01 09:15:27 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 01 09:15:27 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 01 09:15:27 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 01 09:15:27 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 01 09:15:27 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 01 09:15:27 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 01 09:15:27 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e46 e46: 3 total, 3 up, 3 in
Dec 01 09:15:27 compute-0 podman[98035]: 2025-12-01 09:15:27.258542159 +0000 UTC m=+0.074003590 container create 9c67f89bb4483983731a0b281582fcda21c35396142d25c53d617df853aa5584 (image=quay.io/ceph/ceph:v18, name=amazing_tharp, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:15:27 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e46: 3 total, 3 up, 3 in
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.15( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.065162659s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.925071716s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.14( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.053224564s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.913154602s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.15( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.065100670s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.925071716s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.14( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.053148270s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913154602s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.17( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.053172112s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.913414001s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.17( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.053153992s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913414001s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.14( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.101153374s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961578369s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.14( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.101132393s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961578369s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.13( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.100970268s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961547852s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.13( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.100948334s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961547852s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.11( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.052714348s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.913444519s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.11( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.052694321s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913444519s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.12( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.100719452s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961563110s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.12( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.100702286s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961563110s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.100515366s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961517334s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.100497246s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961517334s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.052310944s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.913497925s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.100352287s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961570740s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.052274704s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913497925s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.100304604s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961570740s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.099813461s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961418152s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.099791527s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961418152s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.099128723s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961410522s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.099099159s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961410522s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.051272392s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.913627625s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.099069595s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961448669s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.051211357s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.913597107s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.099026680s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961448669s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.051125526s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913627625s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098785400s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961402893s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.051002502s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.913635254s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098758698s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961402893s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050975800s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913635254s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050869942s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.913658142s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050841331s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913658142s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098507881s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961387634s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050839424s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.913719177s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050799370s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913719177s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098463058s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961387634s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098623276s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961585999s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098595619s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961585999s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050669670s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.913757324s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098088264s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961196899s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050644875s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913757324s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098061562s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961196899s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061552048s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.924736023s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097998619s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961250305s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061717033s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.924995422s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097971916s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961250305s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061690331s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.924995422s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061531067s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.924736023s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097698212s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961128235s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097724915s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961204529s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097672462s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961128235s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097575188s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961067200s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097699165s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961204529s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097554207s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961067200s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061326981s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.924926758s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1127224077' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 09:15:27 compute-0 ceph-mon[75031]: 4.3 scrub starts
Dec 01 09:15:27 compute-0 ceph-mon[75031]: 4.3 scrub ok
Dec 01 09:15:27 compute-0 ceph-mon[75031]: 3.4 scrub starts
Dec 01 09:15:27 compute-0 ceph-mon[75031]: 3.4 scrub ok
Dec 01 09:15:27 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 01 09:15:27 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 01 09:15:27 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 01 09:15:27 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 01 09:15:27 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 01 09:15:27 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.073251724s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.184906006s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.073219299s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184906006s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098457336s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210304260s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098435402s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210304260s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.103429794s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.215400696s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.103402138s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215400696s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072329521s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.184417725s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072308540s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184417725s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072682381s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.184875488s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072663307s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184875488s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072598457s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.184898376s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072576523s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184898376s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071966171s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.184402466s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071948051s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184402466s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097693443s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210258484s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097675323s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210258484s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071633339s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.184318542s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071617126s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184318542s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097608566s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210380554s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097588539s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210380554s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097500801s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210418701s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097480774s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210418701s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068525314s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.184318542s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094830513s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210655212s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094977379s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210678101s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068471909s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184318542s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094768524s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210685730s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094736099s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210678101s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094737053s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210685730s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061299324s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.924926758s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097580910s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961235046s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097556114s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961235046s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097126961s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.960968018s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094796181s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210655212s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097102165s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.960968018s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067185402s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.183311462s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067219734s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.183364868s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067164421s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183311462s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067193031s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183364868s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097393990s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961318970s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094496727s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210739136s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097324371s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961318970s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061881065s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.926002502s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094424248s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210739136s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060898781s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.925033569s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065610886s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181999207s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061853409s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.926002502s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065589905s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181999207s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.096854210s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961059570s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060843468s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.925033569s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.096827507s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961059570s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060742378s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.925109863s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060762405s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.925170898s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060714722s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.925109863s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060591698s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.925079346s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060490608s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.925079346s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060739517s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.925170898s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.047507286s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913597107s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.1e( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.19( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094371796s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210922241s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.18( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094347954s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210922241s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.16( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066887856s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.183380127s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066744804s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183380127s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066463470s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.183166504s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068760872s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.184867859s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068104744s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184867859s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094186783s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210975647s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.13( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094164848s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210975647s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064965248s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181869507s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.14( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.15( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064947128s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181869507s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066201210s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183166504s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093912125s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210968018s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065909386s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.182998657s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093876839s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210968018s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065887451s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.182998657s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064443588s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181739807s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.f( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064415932s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181739807s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064523697s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181938171s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064498901s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181938171s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093406677s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210983276s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093382835s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210983276s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063798904s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181411743s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093309402s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210937500s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063775063s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181411743s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093281746s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210937500s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063556671s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181381226s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063532829s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181381226s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063310623s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181175232s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097242355s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.215141296s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063269615s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181175232s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097218513s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215141296s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097195625s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.215148926s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097170830s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215148926s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097122192s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.215202332s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097093582s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215202332s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097187996s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.215385437s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097146034s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215385437s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.7( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097031593s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.215385437s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097012520s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215385437s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.057223320s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.175636292s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.057195663s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.175636292s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.2( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.096585274s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.215202332s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063093185s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181732178s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.096489906s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215202332s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.4( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.11( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.3( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.8( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.5( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.2( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1f( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.062900543s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181732178s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063192368s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181236267s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.062316895s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181236267s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1c( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.19( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.18( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.1a( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1e( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.1d( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.d( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.c( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.f( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.c( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.f( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.d( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.9( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.2( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.6( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.1( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.7( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.2( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.4( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.4( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.6( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.4( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.5( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.7( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.5( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.3( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.e( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.a( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666628838s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.502311707s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666577339s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502311707s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045631409s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881553650s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045609474s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881553650s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045528412s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881576538s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045514107s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881576538s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666149139s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.502319336s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666124344s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502319336s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045070648s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881378174s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045049667s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881378174s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045023918s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881462097s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045001030s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881462097s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665835381s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.502403259s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665820122s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502403259s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044493675s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881225586s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044479370s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881225586s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044302940s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881187439s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044287682s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881187439s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665797234s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.502769470s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665776253s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502769470s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043943405s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881034851s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043929100s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881034851s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043830872s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881057739s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043811798s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881057739s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.671725273s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.509086609s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.671697617s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.509086609s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670864105s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508338928s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670849800s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508338928s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043417931s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880981445s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043404579s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880981445s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670290947s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508003235s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670258522s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508003235s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670177460s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508010864s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670162201s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508010864s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670031548s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508018494s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670013428s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508018494s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042176247s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880577087s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669865608s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508308411s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042231560s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880676270s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042157173s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880676270s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042071342s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880577087s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669753075s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508308411s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041958809s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880676270s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041932106s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880676270s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669230461s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508064270s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669622421s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508628845s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669596672s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508628845s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041983604s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881034851s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669599533s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508674622s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041935921s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881034851s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669544220s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508674622s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041315079s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880500793s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669204712s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508064270s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041291237s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880500793s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669652939s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508903503s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041206360s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880538940s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669572830s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508903503s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041181564s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880538940s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669507027s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508911133s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669469833s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508911133s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041090965s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880569458s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040925980s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880439758s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041068077s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880569458s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669351578s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508926392s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040884972s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880439758s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040772438s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880477905s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.034724236s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.874450684s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.034696579s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.874450684s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669052124s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508941650s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669108391s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508995056s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040751457s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880477905s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669086456s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508987427s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669052124s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508987427s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669073105s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508995056s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669003487s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508941650s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040740967s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880767822s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.668955803s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.509048462s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040719986s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880767822s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040763855s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880889893s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.668935776s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.509048462s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040732384s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880889893s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.17( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 podman[98035]: 2025-12-01 09:15:27.220404508 +0000 UTC m=+0.035866019 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.13( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.15( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.12( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.9( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.f( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.6( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.3( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.6( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.4( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.3( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.9( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1b( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.18( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1f( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1b( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669313431s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508926392s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:15:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:15:27 compute-0 systemd[1]: Started libpod-conmon-9c67f89bb4483983731a0b281582fcda21c35396142d25c53d617df853aa5584.scope.
Dec 01 09:15:27 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e315c7f38a48a103cce588254720edd7ef126726e0d19dc51abeacc0417919b0/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e315c7f38a48a103cce588254720edd7ef126726e0d19dc51abeacc0417919b0/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:27 compute-0 podman[98035]: 2025-12-01 09:15:27.562239746 +0000 UTC m=+0.377701237 container init 9c67f89bb4483983731a0b281582fcda21c35396142d25c53d617df853aa5584 (image=quay.io/ceph/ceph:v18, name=amazing_tharp, CEPH_REF=reef, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, io.buildah.version=1.39.3)
Dec 01 09:15:27 compute-0 podman[98035]: 2025-12-01 09:15:27.569590439 +0000 UTC m=+0.385051860 container start 9c67f89bb4483983731a0b281582fcda21c35396142d25c53d617df853aa5584 (image=quay.io/ceph/ceph:v18, name=amazing_tharp, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3)
Dec 01 09:15:27 compute-0 podman[98035]: 2025-12-01 09:15:27.57308191 +0000 UTC m=+0.388543401 container attach 9c67f89bb4483983731a0b281582fcda21c35396142d25c53d617df853aa5584 (image=quay.io/ceph/ceph:v18, name=amazing_tharp, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2)
Dec 01 09:15:27 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.b scrub starts
Dec 01 09:15:27 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.b scrub ok
Dec 01 09:15:27 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.6 deep-scrub starts
Dec 01 09:15:27 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.6 deep-scrub ok
Dec 01 09:15:27 compute-0 fervent_boyd[97989]: {
Dec 01 09:15:27 compute-0 fervent_boyd[97989]:     "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec 01 09:15:27 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 01 09:15:27 compute-0 fervent_boyd[97989]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:15:27 compute-0 fervent_boyd[97989]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 01 09:15:27 compute-0 fervent_boyd[97989]:         "osd_id": 0,
Dec 01 09:15:27 compute-0 fervent_boyd[97989]:         "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:15:27 compute-0 fervent_boyd[97989]:         "type": "bluestore"
Dec 01 09:15:27 compute-0 fervent_boyd[97989]:     },
Dec 01 09:15:27 compute-0 fervent_boyd[97989]:     "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec 01 09:15:27 compute-0 fervent_boyd[97989]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:15:27 compute-0 fervent_boyd[97989]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 01 09:15:27 compute-0 fervent_boyd[97989]:         "osd_id": 1,
Dec 01 09:15:27 compute-0 fervent_boyd[97989]:         "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:15:27 compute-0 fervent_boyd[97989]:         "type": "bluestore"
Dec 01 09:15:27 compute-0 fervent_boyd[97989]:     },
Dec 01 09:15:27 compute-0 fervent_boyd[97989]:     "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec 01 09:15:27 compute-0 fervent_boyd[97989]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:15:27 compute-0 fervent_boyd[97989]:         "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec 01 09:15:27 compute-0 fervent_boyd[97989]:         "osd_id": 2,
Dec 01 09:15:27 compute-0 fervent_boyd[97989]:         "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:15:27 compute-0 fervent_boyd[97989]:         "type": "bluestore"
Dec 01 09:15:27 compute-0 fervent_boyd[97989]:     }
Dec 01 09:15:27 compute-0 fervent_boyd[97989]: }
Dec 01 09:15:27 compute-0 systemd[1]: libpod-8a8d2c61cf918dd7b1cc87b58c5c08103fd445c2907770358f1646c0cd957f45.scope: Deactivated successfully.
Dec 01 09:15:27 compute-0 systemd[1]: libpod-8a8d2c61cf918dd7b1cc87b58c5c08103fd445c2907770358f1646c0cd957f45.scope: Consumed 1.264s CPU time.
Dec 01 09:15:27 compute-0 podman[98083]: 2025-12-01 09:15:27.882332693 +0000 UTC m=+0.029853729 container died 8a8d2c61cf918dd7b1cc87b58c5c08103fd445c2907770358f1646c0cd957f45 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=fervent_boyd, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:15:27 compute-0 systemd[1]: var-lib-containers-storage-overlay-379c11194edf416e878dc71ace62775bd6357f827bf6a76c2c0eb0144f65a58f-merged.mount: Deactivated successfully.
Dec 01 09:15:27 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e46 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:15:28 compute-0 podman[98083]: 2025-12-01 09:15:28.034056417 +0000 UTC m=+0.181577453 container remove 8a8d2c61cf918dd7b1cc87b58c5c08103fd445c2907770358f1646c0cd957f45 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=fervent_boyd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:15:28 compute-0 systemd[1]: libpod-conmon-8a8d2c61cf918dd7b1cc87b58c5c08103fd445c2907770358f1646c0cd957f45.scope: Deactivated successfully.
Dec 01 09:15:28 compute-0 sudo[97835]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:28 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:15:28 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:28 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:15:28 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:28 compute-0 ceph-mgr[75324]: [progress INFO root] update: starting ev c16ab44f-2930-4319-a8cf-17cb81d3e674 (Updating mds.cephfs deployment (+1 -> 1))
Dec 01 09:15:28 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.hrlhzj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) v1
Dec 01 09:15:28 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.hrlhzj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Dec 01 09:15:28 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.hrlhzj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Dec 01 09:15:28 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:15:28 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:15:28 compute-0 ceph-mgr[75324]: [cephadm INFO cephadm.serve] Deploying daemon mds.cephfs.compute-0.hrlhzj on compute-0
Dec 01 09:15:28 compute-0 ceph-mgr[75324]: log_channel(cephadm) log [INF] : Deploying daemon mds.cephfs.compute-0.hrlhzj on compute-0
Dec 01 09:15:28 compute-0 ceph-mgr[75324]: [progress INFO root] Completed event 6a1eadee-2fd7-4097-9f7f-4e2c6af1e403 (Global Recovery Event) in 10 seconds
Dec 01 09:15:28 compute-0 sudo[98117]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:15:28 compute-0 sudo[98117]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:28 compute-0 sudo[98117]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:28 compute-0 sudo[98142]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:15:28 compute-0 sudo[98142]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:28 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.openstack"} v 0) v1
Dec 01 09:15:28 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/1477409912' entity='client.admin' cmd=[{"prefix": "auth get", "entity": "client.openstack"}]: dispatch
Dec 01 09:15:28 compute-0 amazing_tharp[98055]: [client.openstack]
Dec 01 09:15:28 compute-0 amazing_tharp[98055]:         key = AQDWWy1pAAAAABAA0JvObGCkXGU+EEwqsvh/8w==
Dec 01 09:15:28 compute-0 amazing_tharp[98055]:         caps mgr = "allow *"
Dec 01 09:15:28 compute-0 amazing_tharp[98055]:         caps mon = "profile rbd"
Dec 01 09:15:28 compute-0 amazing_tharp[98055]:         caps osd = "profile rbd pool=vms, profile rbd pool=volumes, profile rbd pool=backups, profile rbd pool=images, profile rbd pool=cephfs.cephfs.meta, profile rbd pool=cephfs.cephfs.data"
Dec 01 09:15:28 compute-0 sudo[98142]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:28 compute-0 systemd[1]: libpod-9c67f89bb4483983731a0b281582fcda21c35396142d25c53d617df853aa5584.scope: Deactivated successfully.
Dec 01 09:15:28 compute-0 conmon[98055]: conmon 9c67f89bb4483983731a <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-9c67f89bb4483983731a0b281582fcda21c35396142d25c53d617df853aa5584.scope/container/memory.events
Dec 01 09:15:28 compute-0 podman[98035]: 2025-12-01 09:15:28.25820229 +0000 UTC m=+1.073663721 container died 9c67f89bb4483983731a0b281582fcda21c35396142d25c53d617df853aa5584 (image=quay.io/ceph/ceph:v18, name=amazing_tharp, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef)
Dec 01 09:15:28 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e46 do_prune osdmap full prune enabled
Dec 01 09:15:28 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 e47: 3 total, 3 up, 3 in
Dec 01 09:15:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-e315c7f38a48a103cce588254720edd7ef126726e0d19dc51abeacc0417919b0-merged.mount: Deactivated successfully.
Dec 01 09:15:28 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e47: 3 total, 3 up, 3 in
Dec 01 09:15:28 compute-0 sudo[98169]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:15:28 compute-0 sudo[98169]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:28 compute-0 sudo[98169]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:28 compute-0 sudo[98206]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 _orch deploy --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b
Dec 01 09:15:28 compute-0 sudo[98206]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:28 compute-0 ceph-mon[75031]: pgmap v103: 193 pgs: 193 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:28 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 01 09:15:28 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 01 09:15:28 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 01 09:15:28 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 01 09:15:28 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 01 09:15:28 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 01 09:15:28 compute-0 ceph-mon[75031]: osdmap e46: 3 total, 3 up, 3 in
Dec 01 09:15:28 compute-0 ceph-mon[75031]: 3.b scrub starts
Dec 01 09:15:28 compute-0 ceph-mon[75031]: 3.b scrub ok
Dec 01 09:15:28 compute-0 ceph-mon[75031]: 4.6 deep-scrub starts
Dec 01 09:15:28 compute-0 ceph-mon[75031]: 4.6 deep-scrub ok
Dec 01 09:15:28 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:28 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:28 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.hrlhzj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Dec 01 09:15:28 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.hrlhzj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Dec 01 09:15:28 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:15:28 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1477409912' entity='client.admin' cmd=[{"prefix": "auth get", "entity": "client.openstack"}]: dispatch
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.11( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1c( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.10( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.11( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.17( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.12( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.13( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.1b( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.15( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.12( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.16( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.15( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.11( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.17( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.1b( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.11( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.1f( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1d( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.17( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.16( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.9( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.8( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.14( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.b( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.9( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.d( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.a( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.e( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.3( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.5( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.7( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.5( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.4( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.2( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.4( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.7( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.6( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.2( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.4( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.9( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.6( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.d( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.f( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.c( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.c( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.f( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.18( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.d( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1e( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1a( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.18( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.19( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1c( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1d( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1c( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.13( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.1f( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.e( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.8( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.a( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.5( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.5( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.2( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.7( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.11( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.11( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.8( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.14( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.a( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.15( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.8( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.13( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.e( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.c( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.e( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1d( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.f( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1a( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1b( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.18( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1e( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1a( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.14( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.13( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.13( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.15( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.12( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.15( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.9( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.16( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.a( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.3( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.3( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.6( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.2( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.b( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.3( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.5( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.f( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.2( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.6( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.9( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.18( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.1d( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.1f( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.7( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.c( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.1( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.4( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.8( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.4( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.f( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.1b( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.1f( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.18( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.1e( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.19( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.f( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.1c( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:15:28 compute-0 podman[98035]: 2025-12-01 09:15:28.417880647 +0000 UTC m=+1.233342078 container remove 9c67f89bb4483983731a0b281582fcda21c35396142d25c53d617df853aa5584 (image=quay.io/ceph/ceph:v18, name=amazing_tharp, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:15:28 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.c deep-scrub starts
Dec 01 09:15:28 compute-0 systemd[1]: libpod-conmon-9c67f89bb4483983731a0b281582fcda21c35396142d25c53d617df853aa5584.scope: Deactivated successfully.
Dec 01 09:15:28 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.c deep-scrub ok
Dec 01 09:15:28 compute-0 sudo[98032]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:28 compute-0 podman[98271]: 2025-12-01 09:15:28.73565538 +0000 UTC m=+0.046459135 container create c37528195e01e09f47095311302920de621390b8364535312f7d87fe4ab019ce (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=beautiful_yalow, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:15:28 compute-0 systemd[1]: Started libpod-conmon-c37528195e01e09f47095311302920de621390b8364535312f7d87fe4ab019ce.scope.
Dec 01 09:15:28 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:28 compute-0 podman[98271]: 2025-12-01 09:15:28.791030067 +0000 UTC m=+0.101833842 container init c37528195e01e09f47095311302920de621390b8364535312f7d87fe4ab019ce (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=beautiful_yalow, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Dec 01 09:15:28 compute-0 podman[98271]: 2025-12-01 09:15:28.80213587 +0000 UTC m=+0.112939625 container start c37528195e01e09f47095311302920de621390b8364535312f7d87fe4ab019ce (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=beautiful_yalow, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:15:28 compute-0 podman[98271]: 2025-12-01 09:15:28.8059419 +0000 UTC m=+0.116745666 container attach c37528195e01e09f47095311302920de621390b8364535312f7d87fe4ab019ce (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=beautiful_yalow, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507)
Dec 01 09:15:28 compute-0 beautiful_yalow[98286]: 167 167
Dec 01 09:15:28 compute-0 podman[98271]: 2025-12-01 09:15:28.711213005 +0000 UTC m=+0.022016790 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:15:28 compute-0 systemd[1]: libpod-c37528195e01e09f47095311302920de621390b8364535312f7d87fe4ab019ce.scope: Deactivated successfully.
Dec 01 09:15:28 compute-0 podman[98271]: 2025-12-01 09:15:28.808890534 +0000 UTC m=+0.119694289 container died c37528195e01e09f47095311302920de621390b8364535312f7d87fe4ab019ce (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=beautiful_yalow, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:15:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-a8c1ac4085fa0fa47d047a733f6050cd0bf42faafa9103b66e23a1a898800365-merged.mount: Deactivated successfully.
Dec 01 09:15:28 compute-0 podman[98271]: 2025-12-01 09:15:28.94776121 +0000 UTC m=+0.258564965 container remove c37528195e01e09f47095311302920de621390b8364535312f7d87fe4ab019ce (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=beautiful_yalow, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:15:28 compute-0 systemd[1]: libpod-conmon-c37528195e01e09f47095311302920de621390b8364535312f7d87fe4ab019ce.scope: Deactivated successfully.
Dec 01 09:15:28 compute-0 systemd[1]: Reloading.
Dec 01 09:15:29 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v106: 193 pgs: 46 peering, 147 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:29 compute-0 systemd-rc-local-generator[98334]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:15:29 compute-0 systemd-sysv-generator[98337]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:15:29 compute-0 systemd[1]: Reloading.
Dec 01 09:15:29 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.b scrub starts
Dec 01 09:15:29 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.b scrub ok
Dec 01 09:15:29 compute-0 systemd-rc-local-generator[98490]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:15:29 compute-0 systemd-sysv-generator[98496]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:15:29 compute-0 ceph-mon[75031]: Deploying daemon mds.cephfs.compute-0.hrlhzj on compute-0
Dec 01 09:15:29 compute-0 ceph-mon[75031]: osdmap e47: 3 total, 3 up, 3 in
Dec 01 09:15:29 compute-0 ceph-mon[75031]: 2.c deep-scrub starts
Dec 01 09:15:29 compute-0 ceph-mon[75031]: 2.c deep-scrub ok
Dec 01 09:15:29 compute-0 systemd[1]: Starting Ceph mds.cephfs.compute-0.hrlhzj for 5620a9fb-e540-5250-a0e8-7aaad5347e3b...
Dec 01 09:15:29 compute-0 sudo[98534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdykhllsdkrddsiubzoaugrdxxrcwubt ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764580529.4487615-36621-159398386950128/async_wrapper.py j521419402164 30 /home/zuul/.ansible/tmp/ansible-tmp-1764580529.4487615-36621-159398386950128/AnsiballZ_command.py _'
Dec 01 09:15:29 compute-0 sudo[98534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:15:30 compute-0 ansible-async_wrapper.py[98543]: Invoked with j521419402164 30 /home/zuul/.ansible/tmp/ansible-tmp-1764580529.4487615-36621-159398386950128/AnsiballZ_command.py _
Dec 01 09:15:30 compute-0 ansible-async_wrapper.py[98585]: Starting module and watcher
Dec 01 09:15:30 compute-0 ansible-async_wrapper.py[98585]: Start watching 98586 (30)
Dec 01 09:15:30 compute-0 ansible-async_wrapper.py[98586]: Start module (98586)
Dec 01 09:15:30 compute-0 ansible-async_wrapper.py[98543]: Return async_wrapper task started.
Dec 01 09:15:30 compute-0 sudo[98534]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:30 compute-0 podman[98588]: 2025-12-01 09:15:30.232177597 +0000 UTC m=+0.048498910 container create bd39bff8d9d91d0b5e01eaef8e36288bce2634cdb58658e44a05dec668f72782 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mds-cephfs-compute-0-hrlhzj, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:15:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/00c160e10467ebfdc74554f624bdb1b974dd2288c23f65838fc3b97b9eb35861/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/00c160e10467ebfdc74554f624bdb1b974dd2288c23f65838fc3b97b9eb35861/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/00c160e10467ebfdc74554f624bdb1b974dd2288c23f65838fc3b97b9eb35861/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/00c160e10467ebfdc74554f624bdb1b974dd2288c23f65838fc3b97b9eb35861/merged/var/lib/ceph/mds/ceph-cephfs.compute-0.hrlhzj supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:30 compute-0 podman[98588]: 2025-12-01 09:15:30.289685741 +0000 UTC m=+0.106007074 container init bd39bff8d9d91d0b5e01eaef8e36288bce2634cdb58658e44a05dec668f72782 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mds-cephfs-compute-0-hrlhzj, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507)
Dec 01 09:15:30 compute-0 podman[98588]: 2025-12-01 09:15:30.294961519 +0000 UTC m=+0.111282832 container start bd39bff8d9d91d0b5e01eaef8e36288bce2634cdb58658e44a05dec668f72782 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mds-cephfs-compute-0-hrlhzj, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:15:30 compute-0 bash[98588]: bd39bff8d9d91d0b5e01eaef8e36288bce2634cdb58658e44a05dec668f72782
Dec 01 09:15:30 compute-0 podman[98588]: 2025-12-01 09:15:30.204132537 +0000 UTC m=+0.020453870 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:15:30 compute-0 systemd[1]: Started Ceph mds.cephfs.compute-0.hrlhzj for 5620a9fb-e540-5250-a0e8-7aaad5347e3b.
Dec 01 09:15:30 compute-0 python3[98587]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   orch status --format json _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:15:30 compute-0 sudo[98206]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:30 compute-0 ceph-mds[98608]: set uid:gid to 167:167 (ceph:ceph)
Dec 01 09:15:30 compute-0 ceph-mds[98608]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mds, pid 2
Dec 01 09:15:30 compute-0 ceph-mds[98608]: main not setting numa affinity
Dec 01 09:15:30 compute-0 ceph-mds[98608]: pidfile_write: ignore empty --pid-file
Dec 01 09:15:30 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mds-cephfs-compute-0-hrlhzj[98604]: starting mds.cephfs.compute-0.hrlhzj at 
Dec 01 09:15:30 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:15:30 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:30 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:15:30 compute-0 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj Updating MDS map to version 2 from mon.0
Dec 01 09:15:30 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:30 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mds.cephfs}] v 0) v1
Dec 01 09:15:30 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:30 compute-0 ceph-mgr[75324]: [progress INFO root] complete: finished ev c16ab44f-2930-4319-a8cf-17cb81d3e674 (Updating mds.cephfs deployment (+1 -> 1))
Dec 01 09:15:30 compute-0 ceph-mgr[75324]: [progress INFO root] Completed event c16ab44f-2930-4319-a8cf-17cb81d3e674 (Updating mds.cephfs deployment (+1 -> 1)) in 2 seconds
Dec 01 09:15:30 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config set, name=mds_join_fs}] v 0) v1
Dec 01 09:15:30 compute-0 podman[98609]: 2025-12-01 09:15:30.413459839 +0000 UTC m=+0.069704423 container create ee4b4dc89d7f40a1deb213d9e2427b9084190856408f5cd2d5cc10c40a8c4866 (image=quay.io/ceph/ceph:v18, name=friendly_meninsky, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:15:30 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:30 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mds.cephfs}] v 0) v1
Dec 01 09:15:30 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.e scrub starts
Dec 01 09:15:30 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:30 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.e scrub ok
Dec 01 09:15:30 compute-0 systemd[1]: Started libpod-conmon-ee4b4dc89d7f40a1deb213d9e2427b9084190856408f5cd2d5cc10c40a8c4866.scope.
Dec 01 09:15:30 compute-0 podman[98609]: 2025-12-01 09:15:30.372730807 +0000 UTC m=+0.028975421 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:15:30 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:30 compute-0 sudo[98641]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:15:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e2d0685b03cbe14b6087e0188b14446f4ae223cdb47b171a1fd3529276a98ea0/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e2d0685b03cbe14b6087e0188b14446f4ae223cdb47b171a1fd3529276a98ea0/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:30 compute-0 sudo[98641]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:30 compute-0 sudo[98641]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:30 compute-0 podman[98609]: 2025-12-01 09:15:30.500063757 +0000 UTC m=+0.156308371 container init ee4b4dc89d7f40a1deb213d9e2427b9084190856408f5cd2d5cc10c40a8c4866 (image=quay.io/ceph/ceph:v18, name=friendly_meninsky, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:15:30 compute-0 podman[98609]: 2025-12-01 09:15:30.512329286 +0000 UTC m=+0.168573870 container start ee4b4dc89d7f40a1deb213d9e2427b9084190856408f5cd2d5cc10c40a8c4866 (image=quay.io/ceph/ceph:v18, name=friendly_meninsky, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Dec 01 09:15:30 compute-0 podman[98609]: 2025-12-01 09:15:30.516585701 +0000 UTC m=+0.172830295 container attach ee4b4dc89d7f40a1deb213d9e2427b9084190856408f5cd2d5cc10c40a8c4866 (image=quay.io/ceph/ceph:v18, name=friendly_meninsky, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:15:30 compute-0 sudo[98671]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:15:30 compute-0 sudo[98671]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:30 compute-0 sudo[98671]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:30 compute-0 sudo[98697]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:15:30 compute-0 sudo[98697]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:30 compute-0 sudo[98697]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:30 compute-0 sudo[98722]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:15:30 compute-0 sudo[98722]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:30 compute-0 sudo[98722]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:30 compute-0 sudo[98747]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:15:30 compute-0 sudo[98747]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:30 compute-0 sudo[98747]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:30 compute-0 sudo[98772]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ls
Dec 01 09:15:30 compute-0 sudo[98772]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:30 compute-0 ceph-mon[75031]: pgmap v106: 193 pgs: 46 peering, 147 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:30 compute-0 ceph-mon[75031]: 4.b scrub starts
Dec 01 09:15:30 compute-0 ceph-mon[75031]: 4.b scrub ok
Dec 01 09:15:30 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:30 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:30 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:30 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:30 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:31 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v107: 193 pgs: 46 peering, 147 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:31 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14256 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 01 09:15:31 compute-0 friendly_meninsky[98661]: 
Dec 01 09:15:31 compute-0 friendly_meninsky[98661]: {"available": true, "backend": "cephadm", "paused": false, "workers": 10}
Dec 01 09:15:31 compute-0 systemd[1]: libpod-ee4b4dc89d7f40a1deb213d9e2427b9084190856408f5cd2d5cc10c40a8c4866.scope: Deactivated successfully.
Dec 01 09:15:31 compute-0 podman[98609]: 2025-12-01 09:15:31.100963765 +0000 UTC m=+0.757208359 container died ee4b4dc89d7f40a1deb213d9e2427b9084190856408f5cd2d5cc10c40a8c4866 (image=quay.io/ceph/ceph:v18, name=friendly_meninsky, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3)
Dec 01 09:15:31 compute-0 systemd[1]: var-lib-containers-storage-overlay-e2d0685b03cbe14b6087e0188b14446f4ae223cdb47b171a1fd3529276a98ea0-merged.mount: Deactivated successfully.
Dec 01 09:15:31 compute-0 podman[98609]: 2025-12-01 09:15:31.160390851 +0000 UTC m=+0.816635435 container remove ee4b4dc89d7f40a1deb213d9e2427b9084190856408f5cd2d5cc10c40a8c4866 (image=quay.io/ceph/ceph:v18, name=friendly_meninsky, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Dec 01 09:15:31 compute-0 systemd[1]: libpod-conmon-ee4b4dc89d7f40a1deb213d9e2427b9084190856408f5cd2d5cc10c40a8c4866.scope: Deactivated successfully.
Dec 01 09:15:31 compute-0 ansible-async_wrapper.py[98586]: Module complete (98586)
Dec 01 09:15:31 compute-0 sudo[98938]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbcoenzupijtnkjaofapdgpofrxiufwh ; /usr/bin/python3'
Dec 01 09:15:31 compute-0 sudo[98938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:15:31 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).mds e3 new map
Dec 01 09:15:31 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).mds e3 print_map
                                           e3
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        2
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-12-01T09:15:19.277019+0000
                                           modified        2025-12-01T09:15:19.277097+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           max_mds        1
                                           in        
                                           up        {}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.hrlhzj{-1:14254} state up:standby seq 1 addr [v2:192.168.122.100:6814/1560103012,v1:192.168.122.100:6815/1560103012] compat {c=[1],r=[1],i=[7ff]}]
Dec 01 09:15:31 compute-0 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj Updating MDS map to version 3 from mon.0
Dec 01 09:15:31 compute-0 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj Monitors have assigned me to become a standby.
Dec 01 09:15:31 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : mds.? [v2:192.168.122.100:6814/1560103012,v1:192.168.122.100:6815/1560103012] up:boot
Dec 01 09:15:31 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).mds e3 assigned standby [v2:192.168.122.100:6814/1560103012,v1:192.168.122.100:6815/1560103012] as mds.0
Dec 01 09:15:31 compute-0 ceph-mon[75031]: log_channel(cluster) log [INF] : daemon mds.cephfs.compute-0.hrlhzj assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Dec 01 09:15:31 compute-0 ceph-mon[75031]: log_channel(cluster) log [INF] : Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Dec 01 09:15:31 compute-0 ceph-mon[75031]: log_channel(cluster) log [INF] : Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Dec 01 09:15:31 compute-0 ceph-mon[75031]: log_channel(cluster) log [INF] : Cluster is now healthy
Dec 01 09:15:31 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : fsmap cephfs:0 1 up:standby
Dec 01 09:15:31 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mds metadata", "who": "cephfs.compute-0.hrlhzj"} v 0) v1
Dec 01 09:15:31 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-0.hrlhzj"}]: dispatch
Dec 01 09:15:31 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).mds e3 all = 0
Dec 01 09:15:31 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).mds e4 new map
Dec 01 09:15:31 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).mds e4 print_map
                                           e4
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        4
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-12-01T09:15:19.277019+0000
                                           modified        2025-12-01T09:15:31.359857+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           max_mds        1
                                           in        0
                                           up        {0=14254}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           [mds.cephfs.compute-0.hrlhzj{0:14254} state up:creating seq 1 addr [v2:192.168.122.100:6814/1560103012,v1:192.168.122.100:6815/1560103012] compat {c=[1],r=[1],i=[7ff]}]
                                            
                                            
Dec 01 09:15:31 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=cephfs.compute-0.hrlhzj=up:creating}
Dec 01 09:15:31 compute-0 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj Updating MDS map to version 4 from mon.0
Dec 01 09:15:31 compute-0 ceph-mds[98608]: mds.0.4 handle_mds_map i am now mds.0.4
Dec 01 09:15:31 compute-0 ceph-mds[98608]: mds.0.4 handle_mds_map state change up:standby --> up:creating
Dec 01 09:15:31 compute-0 ceph-mds[98608]: mds.0.cache creating system inode with ino:0x1
Dec 01 09:15:31 compute-0 ceph-mds[98608]: mds.0.cache creating system inode with ino:0x100
Dec 01 09:15:31 compute-0 ceph-mds[98608]: mds.0.cache creating system inode with ino:0x600
Dec 01 09:15:31 compute-0 ceph-mds[98608]: mds.0.cache creating system inode with ino:0x601
Dec 01 09:15:31 compute-0 ceph-mds[98608]: mds.0.cache creating system inode with ino:0x602
Dec 01 09:15:31 compute-0 ceph-mds[98608]: mds.0.cache creating system inode with ino:0x603
Dec 01 09:15:31 compute-0 ceph-mds[98608]: mds.0.cache creating system inode with ino:0x604
Dec 01 09:15:31 compute-0 ceph-mds[98608]: mds.0.cache creating system inode with ino:0x605
Dec 01 09:15:31 compute-0 ceph-mds[98608]: mds.0.cache creating system inode with ino:0x606
Dec 01 09:15:31 compute-0 ceph-mds[98608]: mds.0.cache creating system inode with ino:0x607
Dec 01 09:15:31 compute-0 ceph-mds[98608]: mds.0.cache creating system inode with ino:0x608
Dec 01 09:15:31 compute-0 ceph-mds[98608]: mds.0.cache creating system inode with ino:0x609
Dec 01 09:15:31 compute-0 ceph-mds[98608]: mds.0.4 creating_done
Dec 01 09:15:31 compute-0 ceph-mon[75031]: log_channel(cluster) log [INF] : daemon mds.cephfs.compute-0.hrlhzj is now active in filesystem cephfs as rank 0
Dec 01 09:15:31 compute-0 podman[98953]: 2025-12-01 09:15:31.435729407 +0000 UTC m=+0.062605187 container exec a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:15:31 compute-0 python3[98940]: ansible-ansible.legacy.async_status Invoked with jid=j521419402164.98543 mode=status _async_dir=/root/.ansible_async
Dec 01 09:15:31 compute-0 sudo[98938]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:31 compute-0 podman[98953]: 2025-12-01 09:15:31.54579638 +0000 UTC m=+0.172672140 container exec_died a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:15:31 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.c scrub starts
Dec 01 09:15:31 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.c scrub ok
Dec 01 09:15:31 compute-0 sudo[99047]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyjlaeprmrsfrmwxxtahmlpmsmwqdzfc ; /usr/bin/python3'
Dec 01 09:15:31 compute-0 sudo[99047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:15:31 compute-0 python3[99056]: ansible-ansible.legacy.async_status Invoked with jid=j521419402164.98543 mode=cleanup _async_dir=/root/.ansible_async
Dec 01 09:15:31 compute-0 sudo[99047]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:31 compute-0 ceph-mon[75031]: 2.e scrub starts
Dec 01 09:15:31 compute-0 ceph-mon[75031]: 2.e scrub ok
Dec 01 09:15:31 compute-0 ceph-mon[75031]: mds.? [v2:192.168.122.100:6814/1560103012,v1:192.168.122.100:6815/1560103012] up:boot
Dec 01 09:15:31 compute-0 ceph-mon[75031]: daemon mds.cephfs.compute-0.hrlhzj assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Dec 01 09:15:31 compute-0 ceph-mon[75031]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Dec 01 09:15:31 compute-0 ceph-mon[75031]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Dec 01 09:15:31 compute-0 ceph-mon[75031]: Cluster is now healthy
Dec 01 09:15:31 compute-0 ceph-mon[75031]: fsmap cephfs:0 1 up:standby
Dec 01 09:15:31 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-0.hrlhzj"}]: dispatch
Dec 01 09:15:31 compute-0 ceph-mon[75031]: fsmap cephfs:1 {0=cephfs.compute-0.hrlhzj=up:creating}
Dec 01 09:15:31 compute-0 ceph-mon[75031]: daemon mds.cephfs.compute-0.hrlhzj is now active in filesystem cephfs as rank 0
Dec 01 09:15:32 compute-0 sudo[98772]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:32 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:15:32 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:32 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:15:32 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:32 compute-0 sudo[99190]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwjleszptpdoxqafveoqfsafsqoedepk ; /usr/bin/python3'
Dec 01 09:15:32 compute-0 sudo[99190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:15:32 compute-0 sudo[99155]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:15:32 compute-0 sudo[99155]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:32 compute-0 sudo[99155]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:32 compute-0 sudo[99200]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:15:32 compute-0 sudo[99200]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:32 compute-0 sudo[99200]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:32 compute-0 sudo[99225]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:15:32 compute-0 sudo[99225]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:32 compute-0 sudo[99225]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:32 compute-0 python3[99197]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   orch status --format json _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:15:32 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).mds e5 new map
Dec 01 09:15:32 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).mds e5 print_map
                                           e5
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        5
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-12-01T09:15:19.277019+0000
                                           modified        2025-12-01T09:15:32.363675+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           max_mds        1
                                           in        0
                                           up        {0=14254}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           [mds.cephfs.compute-0.hrlhzj{0:14254} state up:active seq 2 join_fscid=1 addr [v2:192.168.122.100:6814/1560103012,v1:192.168.122.100:6815/1560103012] compat {c=[1],r=[1],i=[7ff]}]
                                            
                                            
Dec 01 09:15:32 compute-0 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj Updating MDS map to version 5 from mon.0
Dec 01 09:15:32 compute-0 ceph-mds[98608]: mds.0.4 handle_mds_map i am now mds.0.4
Dec 01 09:15:32 compute-0 ceph-mds[98608]: mds.0.4 handle_mds_map state change up:creating --> up:active
Dec 01 09:15:32 compute-0 ceph-mds[98608]: mds.0.4 recovery_done -- successful recovery!
Dec 01 09:15:32 compute-0 ceph-mds[98608]: mds.0.4 active_start
Dec 01 09:15:32 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : mds.? [v2:192.168.122.100:6814/1560103012,v1:192.168.122.100:6815/1560103012] up:active
Dec 01 09:15:32 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=cephfs.compute-0.hrlhzj=up:active}
Dec 01 09:15:32 compute-0 sudo[99251]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Dec 01 09:15:32 compute-0 sudo[99251]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:32 compute-0 podman[99250]: 2025-12-01 09:15:32.384269306 +0000 UTC m=+0.030241070 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:15:32 compute-0 podman[99250]: 2025-12-01 09:15:32.483633978 +0000 UTC m=+0.129605702 container create e38eb15d455f115143b70f068f8c6245e5a0df5ef55689d672d0a44a547efe78 (image=quay.io/ceph/ceph:v18, name=gifted_blackwell, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:15:32 compute-0 systemd[1]: Started libpod-conmon-e38eb15d455f115143b70f068f8c6245e5a0df5ef55689d672d0a44a547efe78.scope.
Dec 01 09:15:32 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b6b7fbf8bbcb4a8db419853aac851376744b8b07f6f4493d257b013328d49840/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b6b7fbf8bbcb4a8db419853aac851376744b8b07f6f4493d257b013328d49840/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:32 compute-0 podman[99250]: 2025-12-01 09:15:32.554930511 +0000 UTC m=+0.200902245 container init e38eb15d455f115143b70f068f8c6245e5a0df5ef55689d672d0a44a547efe78 (image=quay.io/ceph/ceph:v18, name=gifted_blackwell, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Dec 01 09:15:32 compute-0 podman[99250]: 2025-12-01 09:15:32.561632883 +0000 UTC m=+0.207604607 container start e38eb15d455f115143b70f068f8c6245e5a0df5ef55689d672d0a44a547efe78 (image=quay.io/ceph/ceph:v18, name=gifted_blackwell, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, ceph=True)
Dec 01 09:15:32 compute-0 podman[99250]: 2025-12-01 09:15:32.564847936 +0000 UTC m=+0.210819660 container attach e38eb15d455f115143b70f068f8c6245e5a0df5ef55689d672d0a44a547efe78 (image=quay.io/ceph/ceph:v18, name=gifted_blackwell, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:15:32 compute-0 ceph-mon[75031]: pgmap v107: 193 pgs: 46 peering, 147 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:32 compute-0 ceph-mon[75031]: from='client.14256 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 01 09:15:32 compute-0 ceph-mon[75031]: 4.c scrub starts
Dec 01 09:15:32 compute-0 ceph-mon[75031]: 4.c scrub ok
Dec 01 09:15:32 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:32 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:32 compute-0 ceph-mon[75031]: mds.? [v2:192.168.122.100:6814/1560103012,v1:192.168.122.100:6815/1560103012] up:active
Dec 01 09:15:32 compute-0 ceph-mon[75031]: fsmap cephfs:1 {0=cephfs.compute-0.hrlhzj=up:active}
Dec 01 09:15:32 compute-0 sudo[99251]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:32 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:15:32 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:15:32 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec 01 09:15:32 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:15:32 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec 01 09:15:32 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:32 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 8f31d491-da8e-4f01-8c16-8a37e79ad638 does not exist
Dec 01 09:15:32 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev cd2a7ad4-5e6a-404c-b778-2976ccd148f6 does not exist
Dec 01 09:15:32 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 79bcb737-6fe1-4b25-a9f9-4578d8521ef9 does not exist
Dec 01 09:15:32 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec 01 09:15:32 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:15:32 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec 01 09:15:32 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:15:32 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:15:32 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:15:32 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:15:33 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v108: 193 pgs: 193 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 127 B/s wr, 1 op/s
Dec 01 09:15:33 compute-0 sudo[99345]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:15:33 compute-0 sudo[99345]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:33 compute-0 sudo[99345]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:33 compute-0 sudo[99370]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:15:33 compute-0 sudo[99370]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:33 compute-0 sudo[99370]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:33 compute-0 ceph-mgr[75324]: [progress INFO root] Writing back 11 completed events
Dec 01 09:15:33 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) v1
Dec 01 09:15:33 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:33 compute-0 sudo[99395]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:15:33 compute-0 sudo[99395]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:33 compute-0 sudo[99395]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:33 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14258 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 01 09:15:33 compute-0 gifted_blackwell[99299]: 
Dec 01 09:15:33 compute-0 gifted_blackwell[99299]: {"available": true, "backend": "cephadm", "paused": false, "workers": 10}
Dec 01 09:15:33 compute-0 systemd[1]: libpod-e38eb15d455f115143b70f068f8c6245e5a0df5ef55689d672d0a44a547efe78.scope: Deactivated successfully.
Dec 01 09:15:33 compute-0 podman[99250]: 2025-12-01 09:15:33.214869692 +0000 UTC m=+0.860841416 container died e38eb15d455f115143b70f068f8c6245e5a0df5ef55689d672d0a44a547efe78 (image=quay.io/ceph/ceph:v18, name=gifted_blackwell, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS)
Dec 01 09:15:33 compute-0 sudo[99420]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --yes --no-systemd
Dec 01 09:15:33 compute-0 sudo[99420]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:33 compute-0 systemd[1]: var-lib-containers-storage-overlay-b6b7fbf8bbcb4a8db419853aac851376744b8b07f6f4493d257b013328d49840-merged.mount: Deactivated successfully.
Dec 01 09:15:33 compute-0 podman[99250]: 2025-12-01 09:15:33.315635229 +0000 UTC m=+0.961606943 container remove e38eb15d455f115143b70f068f8c6245e5a0df5ef55689d672d0a44a547efe78 (image=quay.io/ceph/ceph:v18, name=gifted_blackwell, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default)
Dec 01 09:15:33 compute-0 systemd[1]: libpod-conmon-e38eb15d455f115143b70f068f8c6245e5a0df5ef55689d672d0a44a547efe78.scope: Deactivated successfully.
Dec 01 09:15:33 compute-0 sudo[99190]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:33 compute-0 podman[99500]: 2025-12-01 09:15:33.539305797 +0000 UTC m=+0.046159836 container create eda283c26f0b7892c7258c91f7b274d95c0ffb8591ca03566bb60325c4cc18de (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_jemison, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3)
Dec 01 09:15:33 compute-0 systemd[1]: Started libpod-conmon-eda283c26f0b7892c7258c91f7b274d95c0ffb8591ca03566bb60325c4cc18de.scope.
Dec 01 09:15:33 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:33 compute-0 podman[99500]: 2025-12-01 09:15:33.521310736 +0000 UTC m=+0.028164795 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:15:33 compute-0 podman[99500]: 2025-12-01 09:15:33.677705059 +0000 UTC m=+0.184559148 container init eda283c26f0b7892c7258c91f7b274d95c0ffb8591ca03566bb60325c4cc18de (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_jemison, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 01 09:15:33 compute-0 podman[99500]: 2025-12-01 09:15:33.683284806 +0000 UTC m=+0.190138845 container start eda283c26f0b7892c7258c91f7b274d95c0ffb8591ca03566bb60325c4cc18de (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_jemison, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec 01 09:15:33 compute-0 podman[99500]: 2025-12-01 09:15:33.686716374 +0000 UTC m=+0.193570423 container attach eda283c26f0b7892c7258c91f7b274d95c0ffb8591ca03566bb60325c4cc18de (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_jemison, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:15:33 compute-0 nervous_jemison[99516]: 167 167
Dec 01 09:15:33 compute-0 systemd[1]: libpod-eda283c26f0b7892c7258c91f7b274d95c0ffb8591ca03566bb60325c4cc18de.scope: Deactivated successfully.
Dec 01 09:15:33 compute-0 podman[99500]: 2025-12-01 09:15:33.688601964 +0000 UTC m=+0.195456043 container died eda283c26f0b7892c7258c91f7b274d95c0ffb8591ca03566bb60325c4cc18de (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_jemison, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:15:33 compute-0 systemd[1]: var-lib-containers-storage-overlay-16ffc648c1d1975b8cc2180fa4b84de8979d1623c4d8c5401adec8b02702e082-merged.mount: Deactivated successfully.
Dec 01 09:15:33 compute-0 podman[99500]: 2025-12-01 09:15:33.729071648 +0000 UTC m=+0.235925687 container remove eda283c26f0b7892c7258c91f7b274d95c0ffb8591ca03566bb60325c4cc18de (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_jemison, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:15:33 compute-0 systemd[1]: libpod-conmon-eda283c26f0b7892c7258c91f7b274d95c0ffb8591ca03566bb60325c4cc18de.scope: Deactivated successfully.
Dec 01 09:15:33 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:15:33 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:15:33 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:33 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:15:33 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:15:33 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:15:33 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:33 compute-0 podman[99540]: 2025-12-01 09:15:33.888478367 +0000 UTC m=+0.039873807 container create 128769043fb93c1cf426f6dba2e3a5629ab3d4cf6caa344ca95e647f1c154063 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=funny_ritchie, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Dec 01 09:15:33 compute-0 systemd[1]: Started libpod-conmon-128769043fb93c1cf426f6dba2e3a5629ab3d4cf6caa344ca95e647f1c154063.scope.
Dec 01 09:15:33 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab0354d0c607acd3b0ed1cfb2916043e3b0d68fee654fc58b100b09320876852/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab0354d0c607acd3b0ed1cfb2916043e3b0d68fee654fc58b100b09320876852/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab0354d0c607acd3b0ed1cfb2916043e3b0d68fee654fc58b100b09320876852/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:33 compute-0 podman[99540]: 2025-12-01 09:15:33.870696822 +0000 UTC m=+0.022092302 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:15:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab0354d0c607acd3b0ed1cfb2916043e3b0d68fee654fc58b100b09320876852/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab0354d0c607acd3b0ed1cfb2916043e3b0d68fee654fc58b100b09320876852/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:33 compute-0 podman[99540]: 2025-12-01 09:15:33.979438673 +0000 UTC m=+0.130834133 container init 128769043fb93c1cf426f6dba2e3a5629ab3d4cf6caa344ca95e647f1c154063 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=funny_ritchie, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:15:33 compute-0 podman[99540]: 2025-12-01 09:15:33.98595726 +0000 UTC m=+0.137352710 container start 128769043fb93c1cf426f6dba2e3a5629ab3d4cf6caa344ca95e647f1c154063 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=funny_ritchie, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.schema-version=1.0)
Dec 01 09:15:33 compute-0 podman[99540]: 2025-12-01 09:15:33.989678568 +0000 UTC m=+0.141074018 container attach 128769043fb93c1cf426f6dba2e3a5629ab3d4cf6caa344ca95e647f1c154063 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=funny_ritchie, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default)
Dec 01 09:15:34 compute-0 sudo[99585]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzkoeynspyxksojnzoqdbefswpodgzby ; /usr/bin/python3'
Dec 01 09:15:34 compute-0 sudo[99585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:15:34 compute-0 python3[99587]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   orch ls --export -f json _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:15:34 compute-0 podman[99588]: 2025-12-01 09:15:34.235104356 +0000 UTC m=+0.039735742 container create f767a7517ca2e52adef814ced4c4a165fc7f6dc0e28afd9840c0a05b183e8797 (image=quay.io/ceph/ceph:v18, name=exciting_goldwasser, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Dec 01 09:15:34 compute-0 systemd[1]: Started libpod-conmon-f767a7517ca2e52adef814ced4c4a165fc7f6dc0e28afd9840c0a05b183e8797.scope.
Dec 01 09:15:34 compute-0 podman[99588]: 2025-12-01 09:15:34.217012652 +0000 UTC m=+0.021644058 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:15:34 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a68a9202e7b1ba306129f0cef1a1ffcc19f77dfd6d13a79c314edc07f54d3db6/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a68a9202e7b1ba306129f0cef1a1ffcc19f77dfd6d13a79c314edc07f54d3db6/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:34 compute-0 podman[99588]: 2025-12-01 09:15:34.359803463 +0000 UTC m=+0.164434869 container init f767a7517ca2e52adef814ced4c4a165fc7f6dc0e28afd9840c0a05b183e8797 (image=quay.io/ceph/ceph:v18, name=exciting_goldwasser, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:15:34 compute-0 podman[99588]: 2025-12-01 09:15:34.365262866 +0000 UTC m=+0.169894252 container start f767a7517ca2e52adef814ced4c4a165fc7f6dc0e28afd9840c0a05b183e8797 (image=quay.io/ceph/ceph:v18, name=exciting_goldwasser, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:15:34 compute-0 podman[99588]: 2025-12-01 09:15:34.36855409 +0000 UTC m=+0.173185496 container attach f767a7517ca2e52adef814ced4c4a165fc7f6dc0e28afd9840c0a05b183e8797 (image=quay.io/ceph/ceph:v18, name=exciting_goldwasser, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Dec 01 09:15:34 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.d deep-scrub starts
Dec 01 09:15:34 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.d deep-scrub ok
Dec 01 09:15:34 compute-0 ceph-mon[75031]: pgmap v108: 193 pgs: 193 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 127 B/s wr, 1 op/s
Dec 01 09:15:34 compute-0 ceph-mon[75031]: from='client.14258 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 01 09:15:34 compute-0 ceph-mon[75031]: 3.d deep-scrub starts
Dec 01 09:15:34 compute-0 ceph-mon[75031]: 3.d deep-scrub ok
Dec 01 09:15:34 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14260 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 01 09:15:34 compute-0 exciting_goldwasser[99604]: 
Dec 01 09:15:34 compute-0 exciting_goldwasser[99604]: [{"placement": {"host_pattern": "*"}, "service_name": "crash", "service_type": "crash"}, {"placement": {"hosts": ["compute-0"]}, "service_id": "cephfs", "service_name": "mds.cephfs", "service_type": "mds"}, {"placement": {"hosts": ["compute-0"]}, "service_name": "mgr", "service_type": "mgr"}, {"placement": {"hosts": ["compute-0"]}, "service_name": "mon", "service_type": "mon"}, {"placement": {"hosts": ["compute-0"]}, "service_id": "default_drive_group", "service_name": "osd.default_drive_group", "service_type": "osd", "spec": {"data_devices": {"paths": ["/dev/ceph_vg0/ceph_lv0", "/dev/ceph_vg1/ceph_lv1", "/dev/ceph_vg2/ceph_lv2"]}, "filter_logic": "AND", "objectstore": "bluestore"}}]
Dec 01 09:15:34 compute-0 systemd[1]: libpod-f767a7517ca2e52adef814ced4c4a165fc7f6dc0e28afd9840c0a05b183e8797.scope: Deactivated successfully.
Dec 01 09:15:34 compute-0 conmon[99604]: conmon f767a7517ca2e52adef8 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-f767a7517ca2e52adef814ced4c4a165fc7f6dc0e28afd9840c0a05b183e8797.scope/container/memory.events
Dec 01 09:15:34 compute-0 podman[99588]: 2025-12-01 09:15:34.983422461 +0000 UTC m=+0.788053847 container died f767a7517ca2e52adef814ced4c4a165fc7f6dc0e28afd9840c0a05b183e8797 (image=quay.io/ceph/ceph:v18, name=exciting_goldwasser, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec 01 09:15:34 compute-0 funny_ritchie[99557]: --> passed data devices: 0 physical, 3 LVM
Dec 01 09:15:34 compute-0 funny_ritchie[99557]: --> relative data size: 1.0
Dec 01 09:15:34 compute-0 funny_ritchie[99557]: --> All data devices are unavailable
Dec 01 09:15:35 compute-0 systemd[1]: libpod-128769043fb93c1cf426f6dba2e3a5629ab3d4cf6caa344ca95e647f1c154063.scope: Deactivated successfully.
Dec 01 09:15:35 compute-0 podman[99540]: 2025-12-01 09:15:35.036249348 +0000 UTC m=+1.187644798 container died 128769043fb93c1cf426f6dba2e3a5629ab3d4cf6caa344ca95e647f1c154063 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=funny_ritchie, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3)
Dec 01 09:15:35 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v109: 193 pgs: 193 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 127 B/s wr, 1 op/s
Dec 01 09:15:35 compute-0 systemd[1]: var-lib-containers-storage-overlay-a68a9202e7b1ba306129f0cef1a1ffcc19f77dfd6d13a79c314edc07f54d3db6-merged.mount: Deactivated successfully.
Dec 01 09:15:35 compute-0 systemd[1]: var-lib-containers-storage-overlay-ab0354d0c607acd3b0ed1cfb2916043e3b0d68fee654fc58b100b09320876852-merged.mount: Deactivated successfully.
Dec 01 09:15:35 compute-0 podman[99588]: 2025-12-01 09:15:35.076947839 +0000 UTC m=+0.881579225 container remove f767a7517ca2e52adef814ced4c4a165fc7f6dc0e28afd9840c0a05b183e8797 (image=quay.io/ceph/ceph:v18, name=exciting_goldwasser, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507)
Dec 01 09:15:35 compute-0 systemd[1]: libpod-conmon-f767a7517ca2e52adef814ced4c4a165fc7f6dc0e28afd9840c0a05b183e8797.scope: Deactivated successfully.
Dec 01 09:15:35 compute-0 sudo[99585]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:35 compute-0 podman[99540]: 2025-12-01 09:15:35.103406929 +0000 UTC m=+1.254802379 container remove 128769043fb93c1cf426f6dba2e3a5629ab3d4cf6caa344ca95e647f1c154063 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=funny_ritchie, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:15:35 compute-0 systemd[1]: libpod-conmon-128769043fb93c1cf426f6dba2e3a5629ab3d4cf6caa344ca95e647f1c154063.scope: Deactivated successfully.
Dec 01 09:15:35 compute-0 sudo[99420]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:35 compute-0 ansible-async_wrapper.py[98585]: Done in kid B.
Dec 01 09:15:35 compute-0 sudo[99679]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:15:35 compute-0 sudo[99679]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:35 compute-0 sudo[99679]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:35 compute-0 sudo[99704]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:15:35 compute-0 sudo[99704]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:35 compute-0 sudo[99704]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:35 compute-0 sudo[99729]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:15:35 compute-0 sudo[99729]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:35 compute-0 sudo[99729]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:35 compute-0 sudo[99754]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- lvm list --format json
Dec 01 09:15:35 compute-0 sudo[99754]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:35 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.10 deep-scrub starts
Dec 01 09:15:35 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.10 deep-scrub ok
Dec 01 09:15:35 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.10 scrub starts
Dec 01 09:15:35 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.10 scrub ok
Dec 01 09:15:35 compute-0 podman[99818]: 2025-12-01 09:15:35.78952376 +0000 UTC m=+0.057304519 container create 432f31dc887626d130a90a853221b7e562666301831a56fbb9ac9ad96f9faf3e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_vaughan, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Dec 01 09:15:35 compute-0 systemd[1]: Started libpod-conmon-432f31dc887626d130a90a853221b7e562666301831a56fbb9ac9ad96f9faf3e.scope.
Dec 01 09:15:35 compute-0 podman[99818]: 2025-12-01 09:15:35.760689315 +0000 UTC m=+0.028470174 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:15:35 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:35 compute-0 podman[99818]: 2025-12-01 09:15:35.886985413 +0000 UTC m=+0.154766212 container init 432f31dc887626d130a90a853221b7e562666301831a56fbb9ac9ad96f9faf3e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_vaughan, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Dec 01 09:15:35 compute-0 podman[99818]: 2025-12-01 09:15:35.895257045 +0000 UTC m=+0.163037814 container start 432f31dc887626d130a90a853221b7e562666301831a56fbb9ac9ad96f9faf3e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_vaughan, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:15:35 compute-0 podman[99818]: 2025-12-01 09:15:35.899321564 +0000 UTC m=+0.167102363 container attach 432f31dc887626d130a90a853221b7e562666301831a56fbb9ac9ad96f9faf3e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_vaughan, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:15:35 compute-0 determined_vaughan[99840]: 167 167
Dec 01 09:15:35 compute-0 systemd[1]: libpod-432f31dc887626d130a90a853221b7e562666301831a56fbb9ac9ad96f9faf3e.scope: Deactivated successfully.
Dec 01 09:15:35 compute-0 podman[99818]: 2025-12-01 09:15:35.902858027 +0000 UTC m=+0.170638796 container died 432f31dc887626d130a90a853221b7e562666301831a56fbb9ac9ad96f9faf3e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_vaughan, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 01 09:15:35 compute-0 sudo[99860]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqwclizmsrrbyktzjvoqgfctixjnsvmf ; /usr/bin/python3'
Dec 01 09:15:35 compute-0 sudo[99860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:15:35 compute-0 ceph-mon[75031]: from='client.14260 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 01 09:15:35 compute-0 ceph-mon[75031]: 3.10 scrub starts
Dec 01 09:15:35 compute-0 ceph-mon[75031]: 3.10 scrub ok
Dec 01 09:15:35 compute-0 systemd[1]: var-lib-containers-storage-overlay-af16760c2bd1c566ef056dabf05a1620c771446645ac8be5c8f0fae3e1157f94-merged.mount: Deactivated successfully.
Dec 01 09:15:35 compute-0 podman[99818]: 2025-12-01 09:15:35.940998227 +0000 UTC m=+0.208778996 container remove 432f31dc887626d130a90a853221b7e562666301831a56fbb9ac9ad96f9faf3e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_vaughan, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:15:35 compute-0 systemd[1]: libpod-conmon-432f31dc887626d130a90a853221b7e562666301831a56fbb9ac9ad96f9faf3e.scope: Deactivated successfully.
Dec 01 09:15:36 compute-0 python3[99865]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   orch ps -f json _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:15:36 compute-0 podman[99883]: 2025-12-01 09:15:36.152846078 +0000 UTC m=+0.089790800 container create dc04a5d4e8c10377c3b7536f78a2a8580c8937c161be4a91502d3dbe0421017e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_banach, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:15:36 compute-0 podman[99883]: 2025-12-01 09:15:36.092200384 +0000 UTC m=+0.029145086 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:15:36 compute-0 systemd[1]: Started libpod-conmon-dc04a5d4e8c10377c3b7536f78a2a8580c8937c161be4a91502d3dbe0421017e.scope.
Dec 01 09:15:36 compute-0 podman[99895]: 2025-12-01 09:15:36.212412688 +0000 UTC m=+0.116093055 container create 61a2d060f4275bef784e4618e19cfcbe36c90a10ea88c18622b2be68d69b4f9b (image=quay.io/ceph/ceph:v18, name=vibrant_agnesi, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Dec 01 09:15:36 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:36 compute-0 systemd[1]: Started libpod-conmon-61a2d060f4275bef784e4618e19cfcbe36c90a10ea88c18622b2be68d69b4f9b.scope.
Dec 01 09:15:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c4e3c611eb7fe346f2906fcf4480a112e83c7875b135c99ff80b639b22dd8cf8/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c4e3c611eb7fe346f2906fcf4480a112e83c7875b135c99ff80b639b22dd8cf8/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c4e3c611eb7fe346f2906fcf4480a112e83c7875b135c99ff80b639b22dd8cf8/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c4e3c611eb7fe346f2906fcf4480a112e83c7875b135c99ff80b639b22dd8cf8/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:36 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9998c787f466d491288c66d6cd37a30bf6ca01dbd2187f78fb60a702b52e24f/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9998c787f466d491288c66d6cd37a30bf6ca01dbd2187f78fb60a702b52e24f/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:36 compute-0 podman[99883]: 2025-12-01 09:15:36.262178687 +0000 UTC m=+0.199123409 container init dc04a5d4e8c10377c3b7536f78a2a8580c8937c161be4a91502d3dbe0421017e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_banach, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS)
Dec 01 09:15:36 compute-0 podman[99883]: 2025-12-01 09:15:36.270588604 +0000 UTC m=+0.207533306 container start dc04a5d4e8c10377c3b7536f78a2a8580c8937c161be4a91502d3dbe0421017e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_banach, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:15:36 compute-0 podman[99895]: 2025-12-01 09:15:36.178790581 +0000 UTC m=+0.082471008 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:15:36 compute-0 podman[99895]: 2025-12-01 09:15:36.27550838 +0000 UTC m=+0.179188807 container init 61a2d060f4275bef784e4618e19cfcbe36c90a10ea88c18622b2be68d69b4f9b (image=quay.io/ceph/ceph:v18, name=vibrant_agnesi, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:15:36 compute-0 podman[99895]: 2025-12-01 09:15:36.281934834 +0000 UTC m=+0.185615231 container start 61a2d060f4275bef784e4618e19cfcbe36c90a10ea88c18622b2be68d69b4f9b (image=quay.io/ceph/ceph:v18, name=vibrant_agnesi, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:15:36 compute-0 podman[99883]: 2025-12-01 09:15:36.282377248 +0000 UTC m=+0.219321960 container attach dc04a5d4e8c10377c3b7536f78a2a8580c8937c161be4a91502d3dbe0421017e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_banach, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:15:36 compute-0 podman[99895]: 2025-12-01 09:15:36.286964564 +0000 UTC m=+0.190644941 container attach 61a2d060f4275bef784e4618e19cfcbe36c90a10ea88c18622b2be68d69b4f9b (image=quay.io/ceph/ceph:v18, name=vibrant_agnesi, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS)
Dec 01 09:15:36 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14262 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 01 09:15:36 compute-0 vibrant_agnesi[99917]: 
Dec 01 09:15:36 compute-0 vibrant_agnesi[99917]: [{"container_id": "83d60e6b432c", "container_image_digests": ["quay.io/ceph/ceph@sha256:7d8bb82696d5d9cbeae2a2828dc12b6835aa2dded890fa3ac5a733cb66b72b1c", "quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0"], "container_image_id": "0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1", "container_image_name": "quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0", "cpu_percentage": "0.39%", "created": "2025-12-01T09:13:40.947545Z", "daemon_id": "compute-0", "daemon_name": "crash.compute-0", "daemon_type": "crash", "events": ["2025-12-01T09:13:41.007313Z daemon:crash.compute-0 [INFO] \"Deployed crash.compute-0 on host 'compute-0'\""], "hostname": "compute-0", "is_active": false, "last_refresh": "2025-12-01T09:15:32.118088Z", "memory_usage": 11586764, "ports": [], "service_name": "crash", "started": "2025-12-01T09:13:40.847848Z", "status": 1, "status_desc": "running", "systemd_unit": "ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b@crash.compute-0", "version": "18.2.7"}, {"container_id": "bd39bff8d9d9", "container_image_digests": ["quay.io/ceph/ceph@sha256:7d8bb82696d5d9cbeae2a2828dc12b6835aa2dded890fa3ac5a733cb66b72b1c", "quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0"], "container_image_id": "0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1", "container_image_name": "quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0", "cpu_percentage": "6.98%", "created": "2025-12-01T09:15:30.308898Z", "daemon_id": "cephfs.compute-0.hrlhzj", "daemon_name": "mds.cephfs.compute-0.hrlhzj", "daemon_type": "mds", "events": ["2025-12-01T09:15:30.360426Z daemon:mds.cephfs.compute-0.hrlhzj [INFO] \"Deployed mds.cephfs.compute-0.hrlhzj on host 'compute-0'\""], "hostname": "compute-0", "is_active": false, "last_refresh": "2025-12-01T09:15:32.118409Z", "memory_usage": 15833497, "ports": [], "service_name": "mds.cephfs", "started": "2025-12-01T09:15:30.209936Z", "status": 1, "status_desc": "running", "systemd_unit": "ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b@mds.cephfs.compute-0.hrlhzj", "version": "18.2.7"}, {"container_id": "d04e39f95959", "container_image_digests": ["quay.io/ceph/ceph@sha256:7d8bb82696d5d9cbeae2a2828dc12b6835aa2dded890fa3ac5a733cb66b72b1c", "quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0"], "container_image_id": "0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1", "container_image_name": "quay.io/ceph/ceph:v18", "cpu_percentage": "26.87%", "created": "2025-12-01T09:12:25.057330Z", "daemon_id": "compute-0.psduho", "daemon_name": "mgr.compute-0.psduho", "daemon_type": "mgr", "events": ["2025-12-01T09:14:43.592702Z daemon:mgr.compute-0.psduho [INFO] \"Reconfigured mgr.compute-0.psduho on host 'compute-0'\""], "hostname": "compute-0", "is_active": false, "last_refresh": "2025-12-01T09:15:32.118023Z", "memory_usage": 549873254, "ports": [9283, 8765], "service_name": "mgr", "started": "2025-12-01T09:12:24.972967Z", "status": 1, "status_desc": "running", "systemd_unit": "ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b@mgr.compute-0.psduho", "version": "18.2.7"}, {"container_id": "a46df485ce4f", "container_image_digests": ["quay.io/ceph/ceph@sha256:7d8bb82696d5d9cbeae2a2828dc12b6835aa2dded890fa3ac5a733cb66b72b1c", "quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0"], "container_image_id": "0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1", "container_image_name": "quay.io/ceph/ceph:v18", "cpu_percentage": "1.96%", "created": "2025-12-01T09:12:20.053188Z", "daemon_id": "compute-0", "daemon_name": "mon.compute-0", "daemon_type": "mon", "events": ["2025-12-01T09:14:42.780063Z daemon:mon.compute-0 [INFO] \"Reconfigured mon.compute-0 on host 'compute-0'\""], "hostname": "compute-0", "is_active": false, "last_refresh": "2025-12-01T09:15:32.117922Z", "memory_request": 2147483648, "memory_usage": 39992688, "ports": [], "service_name": "mon", "started": "2025-12-01T09:12:22.666870Z", "status": 1, "status_desc": "running", "systemd_unit": "ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b@mon.compute-0", "version": "18.2.7"}, {"container_id": "b27d497db5b1", "container_image_digests": ["quay.io/ceph/ceph@sha256:7d8bb82696d5d9cbeae2a2828dc12b6835aa2dded890fa3ac5a733cb66b72b1c", "quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0"], "container_image_id": "0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1", "container_image_name": "quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0", "cpu_percentage": "1.95%", "created": "2025-12-01T09:14:08.958385Z", "daemon_id": "0", "daemon_name": "osd.0", "daemon_type": "osd", "events": ["2025-12-01T09:14:09.023175Z daemon:osd.0 [INFO] \"Deployed osd.0 on host 'compute-0'\""], "hostname": "compute-0", "is_active": false, "last_refresh": "2025-12-01T09:15:32.118154Z", "memory_request": 4294967296, "memory_usage": 67360522, "ports": [], "service_name": "osd.default_drive_group", "started": "2025-12-01T09:14:08.869614Z", "status": 1, "status_desc": "running", "systemd_unit": "ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b@osd.0", "version": "18.2.7"}, {"container_id": "2203330e3b4c", "container_image_digests": ["quay.io/ceph/ceph@sha256:7d8bb82696d5d9cbeae2a2828dc12b6835aa2dded890fa3ac5a733cb66b72b1c", "quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0"], "container_image_id": "0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1", "container_image_name": "quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0", "cpu_percentage": "2.25%", "created": "2025-12-01T09:14:13.757531Z", "daemon_id": "1", "daemon_name": "osd.1", "daemon_type": "osd", "events": ["2025-12-01T09:14:13.869959Z daemon:osd.1 [INFO] \"Deployed osd.1 on host 'compute-0'\""], "hostname": "compute-0", "is_active": false, "last_refresh": "2025-12-01T09:15:32.118215Z", "memory_request": 4294967296, "memory_usage": 66007859, "ports": [], "service_name": "osd.default_drive_group", "started": "2025-12-01T09:14:13.345532Z", "status": 1, "status_desc": "running", "systemd_unit": "ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b@osd.1", "version": "18.2.7"}, {"container_id": "b8cc745a8217", "container_image_digests": ["quay.io/ceph/ceph@sha256:7d8bb82696d5d9cbeae2a2828dc12b6835aa2dded890fa3ac5a733cb66b72b1c", "quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0"], "container_image_id": "0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1", "container_image_name": "quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0", "cpu_percentage": "3.05%", "created": "2025-12-01T09:14:22.409795Z", "daemon_id": "2", "daemon_name": "osd.2", "daemon_type": "osd", "events": ["2025-12-01T09:14:22.523036Z daemon:osd.2 [INFO] \"Deployed osd.2 on host 'compute-0'\""], "hostname": "compute-0", "is_active": false, "last_refresh": "2025-12-01T09:15:32.118274Z", "memory_request": 4294967296, "memory_usage": 66175631, "ports": [], "service_name": "osd.default_drive_group", "started": "2025-12-01T09:14:22.192700Z", "status": 1, "status_desc": "running", "systemd_unit": "ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b@osd.2", "version": "18.2.7"}]
Dec 01 09:15:36 compute-0 systemd[1]: libpod-61a2d060f4275bef784e4618e19cfcbe36c90a10ea88c18622b2be68d69b4f9b.scope: Deactivated successfully.
Dec 01 09:15:36 compute-0 podman[99895]: 2025-12-01 09:15:36.84749373 +0000 UTC m=+0.751174137 container died 61a2d060f4275bef784e4618e19cfcbe36c90a10ea88c18622b2be68d69b4f9b (image=quay.io/ceph/ceph:v18, name=vibrant_agnesi, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:15:36 compute-0 systemd[1]: var-lib-containers-storage-overlay-f9998c787f466d491288c66d6cd37a30bf6ca01dbd2187f78fb60a702b52e24f-merged.mount: Deactivated successfully.
Dec 01 09:15:36 compute-0 podman[99895]: 2025-12-01 09:15:36.900891155 +0000 UTC m=+0.804571532 container remove 61a2d060f4275bef784e4618e19cfcbe36c90a10ea88c18622b2be68d69b4f9b (image=quay.io/ceph/ceph:v18, name=vibrant_agnesi, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:15:36 compute-0 systemd[1]: libpod-conmon-61a2d060f4275bef784e4618e19cfcbe36c90a10ea88c18622b2be68d69b4f9b.scope: Deactivated successfully.
Dec 01 09:15:36 compute-0 sudo[99860]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:36 compute-0 ceph-mon[75031]: pgmap v109: 193 pgs: 193 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 127 B/s wr, 1 op/s
Dec 01 09:15:36 compute-0 ceph-mon[75031]: 2.10 deep-scrub starts
Dec 01 09:15:36 compute-0 ceph-mon[75031]: 2.10 deep-scrub ok
Dec 01 09:15:37 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v110: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s wr, 4 op/s
Dec 01 09:15:37 compute-0 lucid_banach[99912]: {
Dec 01 09:15:37 compute-0 lucid_banach[99912]:     "0": [
Dec 01 09:15:37 compute-0 lucid_banach[99912]:         {
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             "devices": [
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "/dev/loop3"
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             ],
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             "lv_name": "ceph_lv0",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             "lv_size": "21470642176",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             "name": "ceph_lv0",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             "tags": {
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.cluster_name": "ceph",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.crush_device_class": "",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.encrypted": "0",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.osd_id": "0",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.type": "block",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.vdo": "0"
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             },
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             "type": "block",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             "vg_name": "ceph_vg0"
Dec 01 09:15:37 compute-0 lucid_banach[99912]:         }
Dec 01 09:15:37 compute-0 lucid_banach[99912]:     ],
Dec 01 09:15:37 compute-0 lucid_banach[99912]:     "1": [
Dec 01 09:15:37 compute-0 lucid_banach[99912]:         {
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             "devices": [
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "/dev/loop4"
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             ],
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             "lv_name": "ceph_lv1",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             "lv_size": "21470642176",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             "name": "ceph_lv1",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             "tags": {
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.cluster_name": "ceph",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.crush_device_class": "",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.encrypted": "0",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.osd_id": "1",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.type": "block",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.vdo": "0"
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             },
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             "type": "block",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             "vg_name": "ceph_vg1"
Dec 01 09:15:37 compute-0 lucid_banach[99912]:         }
Dec 01 09:15:37 compute-0 lucid_banach[99912]:     ],
Dec 01 09:15:37 compute-0 lucid_banach[99912]:     "2": [
Dec 01 09:15:37 compute-0 lucid_banach[99912]:         {
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             "devices": [
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "/dev/loop5"
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             ],
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             "lv_name": "ceph_lv2",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             "lv_size": "21470642176",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             "name": "ceph_lv2",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             "tags": {
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.cluster_name": "ceph",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.crush_device_class": "",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.encrypted": "0",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.osd_id": "2",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.type": "block",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:                 "ceph.vdo": "0"
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             },
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             "type": "block",
Dec 01 09:15:37 compute-0 lucid_banach[99912]:             "vg_name": "ceph_vg2"
Dec 01 09:15:37 compute-0 lucid_banach[99912]:         }
Dec 01 09:15:37 compute-0 lucid_banach[99912]:     ]
Dec 01 09:15:37 compute-0 lucid_banach[99912]: }
Dec 01 09:15:37 compute-0 systemd[1]: libpod-dc04a5d4e8c10377c3b7536f78a2a8580c8937c161be4a91502d3dbe0421017e.scope: Deactivated successfully.
Dec 01 09:15:37 compute-0 podman[99883]: 2025-12-01 09:15:37.0970578 +0000 UTC m=+1.034002512 container died dc04a5d4e8c10377c3b7536f78a2a8580c8937c161be4a91502d3dbe0421017e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_banach, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Dec 01 09:15:37 compute-0 systemd[1]: var-lib-containers-storage-overlay-c4e3c611eb7fe346f2906fcf4480a112e83c7875b135c99ff80b639b22dd8cf8-merged.mount: Deactivated successfully.
Dec 01 09:15:37 compute-0 podman[99883]: 2025-12-01 09:15:37.156067942 +0000 UTC m=+1.093012644 container remove dc04a5d4e8c10377c3b7536f78a2a8580c8937c161be4a91502d3dbe0421017e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_banach, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:15:37 compute-0 systemd[1]: libpod-conmon-dc04a5d4e8c10377c3b7536f78a2a8580c8937c161be4a91502d3dbe0421017e.scope: Deactivated successfully.
Dec 01 09:15:37 compute-0 sudo[99754]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:37 compute-0 sudo[99973]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:15:37 compute-0 sudo[99973]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:37 compute-0 sudo[99973]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:37 compute-0 sudo[99998]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:15:37 compute-0 sudo[99998]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:37 compute-0 sudo[99998]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:37 compute-0 sudo[100023]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:15:37 compute-0 sudo[100023]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:37 compute-0 sudo[100023]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:37 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.12 deep-scrub starts
Dec 01 09:15:37 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.12 deep-scrub ok
Dec 01 09:15:37 compute-0 sudo[100048]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- raw list --format json
Dec 01 09:15:37 compute-0 sudo[100048]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.15 scrub starts
Dec 01 09:15:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.15 scrub ok
Dec 01 09:15:37 compute-0 podman[100114]: 2025-12-01 09:15:37.760494352 +0000 UTC m=+0.044056689 container create 4834e8f5f358af76f540ebd171a5d33081724bf82622e719a92c3389fd541a8e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_yonath, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec 01 09:15:37 compute-0 sudo[100149]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpmkoglcctbambgwofmukdizvuoxbkzt ; /usr/bin/python3'
Dec 01 09:15:37 compute-0 sudo[100149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:15:37 compute-0 systemd[1]: Started libpod-conmon-4834e8f5f358af76f540ebd171a5d33081724bf82622e719a92c3389fd541a8e.scope.
Dec 01 09:15:37 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:37 compute-0 podman[100114]: 2025-12-01 09:15:37.740802137 +0000 UTC m=+0.024364524 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:15:37 compute-0 podman[100114]: 2025-12-01 09:15:37.835884154 +0000 UTC m=+0.119446521 container init 4834e8f5f358af76f540ebd171a5d33081724bf82622e719a92c3389fd541a8e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_yonath, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 01 09:15:37 compute-0 podman[100114]: 2025-12-01 09:15:37.84491181 +0000 UTC m=+0.128474157 container start 4834e8f5f358af76f540ebd171a5d33081724bf82622e719a92c3389fd541a8e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_yonath, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 01 09:15:37 compute-0 podman[100114]: 2025-12-01 09:15:37.848854856 +0000 UTC m=+0.132417223 container attach 4834e8f5f358af76f540ebd171a5d33081724bf82622e719a92c3389fd541a8e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_yonath, CEPH_REF=reef, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:15:37 compute-0 vigorous_yonath[100156]: 167 167
Dec 01 09:15:37 compute-0 systemd[1]: libpod-4834e8f5f358af76f540ebd171a5d33081724bf82622e719a92c3389fd541a8e.scope: Deactivated successfully.
Dec 01 09:15:37 compute-0 conmon[100156]: conmon 4834e8f5f358af76f540 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-4834e8f5f358af76f540ebd171a5d33081724bf82622e719a92c3389fd541a8e.scope/container/memory.events
Dec 01 09:15:37 compute-0 podman[100114]: 2025-12-01 09:15:37.852314045 +0000 UTC m=+0.135876402 container died 4834e8f5f358af76f540ebd171a5d33081724bf82622e719a92c3389fd541a8e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_yonath, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:15:37 compute-0 systemd[1]: var-lib-containers-storage-overlay-e00d8f7c9645ade1560c28813eb79379342aa5928e31bf433b939565a5f2fb6f-merged.mount: Deactivated successfully.
Dec 01 09:15:37 compute-0 podman[100114]: 2025-12-01 09:15:37.901026931 +0000 UTC m=+0.184589268 container remove 4834e8f5f358af76f540ebd171a5d33081724bf82622e719a92c3389fd541a8e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_yonath, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:15:37 compute-0 systemd[1]: libpod-conmon-4834e8f5f358af76f540ebd171a5d33081724bf82622e719a92c3389fd541a8e.scope: Deactivated successfully.
Dec 01 09:15:37 compute-0 python3[100153]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   -s -f json _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:15:37 compute-0 ceph-mon[75031]: from='client.14262 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 01 09:15:37 compute-0 ceph-mon[75031]: 4.15 scrub starts
Dec 01 09:15:37 compute-0 ceph-mon[75031]: 4.15 scrub ok
Dec 01 09:15:37 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:15:37 compute-0 podman[100175]: 2025-12-01 09:15:37.991355167 +0000 UTC m=+0.042544941 container create c5f078f6c8318e72e94a11f4ff37aa16979675507aae6340644794f329b18d43 (image=quay.io/ceph/ceph:v18, name=confident_chatelet, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec 01 09:15:38 compute-0 systemd[1]: Started libpod-conmon-c5f078f6c8318e72e94a11f4ff37aa16979675507aae6340644794f329b18d43.scope.
Dec 01 09:15:38 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3994441c15770f7bcac96802baa5b10446392f18724c28602e22f1b1d11c642c/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3994441c15770f7bcac96802baa5b10446392f18724c28602e22f1b1d11c642c/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:38 compute-0 podman[100175]: 2025-12-01 09:15:38.065225291 +0000 UTC m=+0.116415085 container init c5f078f6c8318e72e94a11f4ff37aa16979675507aae6340644794f329b18d43 (image=quay.io/ceph/ceph:v18, name=confident_chatelet, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec 01 09:15:38 compute-0 podman[100175]: 2025-12-01 09:15:37.972137468 +0000 UTC m=+0.023327262 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:15:38 compute-0 podman[100195]: 2025-12-01 09:15:38.069891679 +0000 UTC m=+0.048821690 container create 057b0e4d6df4a0ca64f69ee1cb788e9eb6dd0f90a046f410062b065e03876d82 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_perlman, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Dec 01 09:15:38 compute-0 podman[100175]: 2025-12-01 09:15:38.071260293 +0000 UTC m=+0.122450067 container start c5f078f6c8318e72e94a11f4ff37aa16979675507aae6340644794f329b18d43 (image=quay.io/ceph/ceph:v18, name=confident_chatelet, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec 01 09:15:38 compute-0 podman[100175]: 2025-12-01 09:15:38.074148905 +0000 UTC m=+0.125338709 container attach c5f078f6c8318e72e94a11f4ff37aa16979675507aae6340644794f329b18d43 (image=quay.io/ceph/ceph:v18, name=confident_chatelet, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, ceph=True, org.label-schema.schema-version=1.0)
Dec 01 09:15:38 compute-0 systemd[1]: Started libpod-conmon-057b0e4d6df4a0ca64f69ee1cb788e9eb6dd0f90a046f410062b065e03876d82.scope.
Dec 01 09:15:38 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b5528a13e77310e2f767264d5587e5ec2e1c72d33e8c2f68b21550d5e61de23/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b5528a13e77310e2f767264d5587e5ec2e1c72d33e8c2f68b21550d5e61de23/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b5528a13e77310e2f767264d5587e5ec2e1c72d33e8c2f68b21550d5e61de23/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b5528a13e77310e2f767264d5587e5ec2e1c72d33e8c2f68b21550d5e61de23/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:38 compute-0 podman[100195]: 2025-12-01 09:15:38.046791516 +0000 UTC m=+0.025721547 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:15:38 compute-0 podman[100195]: 2025-12-01 09:15:38.153149241 +0000 UTC m=+0.132079272 container init 057b0e4d6df4a0ca64f69ee1cb788e9eb6dd0f90a046f410062b065e03876d82 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_perlman, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, ceph=True)
Dec 01 09:15:38 compute-0 podman[100195]: 2025-12-01 09:15:38.161130355 +0000 UTC m=+0.140060366 container start 057b0e4d6df4a0ca64f69ee1cb788e9eb6dd0f90a046f410062b065e03876d82 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_perlman, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Dec 01 09:15:38 compute-0 podman[100195]: 2025-12-01 09:15:38.164358797 +0000 UTC m=+0.143288828 container attach 057b0e4d6df4a0ca64f69ee1cb788e9eb6dd0f90a046f410062b065e03876d82 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_perlman, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:15:38 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.14 scrub starts
Dec 01 09:15:38 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.14 scrub ok
Dec 01 09:15:38 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json"} v 0) v1
Dec 01 09:15:38 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3091290908' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Dec 01 09:15:38 compute-0 confident_chatelet[100202]: 
Dec 01 09:15:38 compute-0 confident_chatelet[100202]: {"fsid":"5620a9fb-e540-5250-a0e8-7aaad5347e3b","health":{"status":"HEALTH_OK","checks":{},"mutes":[]},"election_epoch":5,"quorum":[0],"quorum_names":["compute-0"],"quorum_age":195,"monmap":{"epoch":1,"min_mon_release_name":"reef","num_mons":1},"osdmap":{"epoch":47,"num_osds":3,"num_up_osds":3,"osd_up_since":1764580474,"num_in_osds":3,"osd_in_since":1764580437,"num_remapped_pgs":0},"pgmap":{"pgs_by_state":[{"state_name":"active+clean","count":193}],"num_pgs":193,"num_pools":7,"num_objects":23,"data_bytes":461642,"bytes_used":84344832,"bytes_avail":64327581696,"bytes_total":64411926528,"write_bytes_sec":1465,"read_op_per_sec":0,"write_op_per_sec":4},"fsmap":{"epoch":5,"id":1,"up":1,"in":1,"max":1,"by_rank":[{"filesystem_id":1,"rank":0,"name":"cephfs.compute-0.hrlhzj","status":"up:active","gid":14254}],"up:standby":0},"mgrmap":{"available":true,"num_standbys":0,"modules":["cephadm","iostat","nfs","restful"],"services":{}},"servicemap":{"epoch":5,"modified":"2025-12-01T09:15:37.039828+0000","services":{"mds":{"daemons":{"summary":"","cephfs.compute-0.hrlhzj":{"start_epoch":0,"start_stamp":"0.000000","gid":0,"addr":"(unrecognized address family 0)/0","metadata":{},"task_status":{}}}}}},"progress_events":{}}
Dec 01 09:15:38 compute-0 systemd[1]: libpod-c5f078f6c8318e72e94a11f4ff37aa16979675507aae6340644794f329b18d43.scope: Deactivated successfully.
Dec 01 09:15:38 compute-0 podman[100175]: 2025-12-01 09:15:38.689247143 +0000 UTC m=+0.740436927 container died c5f078f6c8318e72e94a11f4ff37aa16979675507aae6340644794f329b18d43 (image=quay.io/ceph/ceph:v18, name=confident_chatelet, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:15:38 compute-0 systemd[1]: var-lib-containers-storage-overlay-3994441c15770f7bcac96802baa5b10446392f18724c28602e22f1b1d11c642c-merged.mount: Deactivated successfully.
Dec 01 09:15:38 compute-0 podman[100175]: 2025-12-01 09:15:38.756724444 +0000 UTC m=+0.807914218 container remove c5f078f6c8318e72e94a11f4ff37aa16979675507aae6340644794f329b18d43 (image=quay.io/ceph/ceph:v18, name=confident_chatelet, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 01 09:15:38 compute-0 sudo[100149]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:38 compute-0 systemd[1]: libpod-conmon-c5f078f6c8318e72e94a11f4ff37aa16979675507aae6340644794f329b18d43.scope: Deactivated successfully.
Dec 01 09:15:38 compute-0 ceph-mon[75031]: pgmap v110: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s wr, 4 op/s
Dec 01 09:15:38 compute-0 ceph-mon[75031]: 2.12 deep-scrub starts
Dec 01 09:15:38 compute-0 ceph-mon[75031]: 2.12 deep-scrub ok
Dec 01 09:15:38 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3091290908' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Dec 01 09:15:39 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v111: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s wr, 4 op/s
Dec 01 09:15:39 compute-0 optimistic_perlman[100215]: {
Dec 01 09:15:39 compute-0 optimistic_perlman[100215]:     "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec 01 09:15:39 compute-0 optimistic_perlman[100215]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:15:39 compute-0 optimistic_perlman[100215]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 01 09:15:39 compute-0 optimistic_perlman[100215]:         "osd_id": 0,
Dec 01 09:15:39 compute-0 optimistic_perlman[100215]:         "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:15:39 compute-0 optimistic_perlman[100215]:         "type": "bluestore"
Dec 01 09:15:39 compute-0 optimistic_perlman[100215]:     },
Dec 01 09:15:39 compute-0 optimistic_perlman[100215]:     "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec 01 09:15:39 compute-0 optimistic_perlman[100215]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:15:39 compute-0 optimistic_perlman[100215]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 01 09:15:39 compute-0 optimistic_perlman[100215]:         "osd_id": 1,
Dec 01 09:15:39 compute-0 optimistic_perlman[100215]:         "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:15:39 compute-0 optimistic_perlman[100215]:         "type": "bluestore"
Dec 01 09:15:39 compute-0 optimistic_perlman[100215]:     },
Dec 01 09:15:39 compute-0 optimistic_perlman[100215]:     "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec 01 09:15:39 compute-0 optimistic_perlman[100215]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:15:39 compute-0 optimistic_perlman[100215]:         "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec 01 09:15:39 compute-0 optimistic_perlman[100215]:         "osd_id": 2,
Dec 01 09:15:39 compute-0 optimistic_perlman[100215]:         "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:15:39 compute-0 optimistic_perlman[100215]:         "type": "bluestore"
Dec 01 09:15:39 compute-0 optimistic_perlman[100215]:     }
Dec 01 09:15:39 compute-0 optimistic_perlman[100215]: }
Dec 01 09:15:39 compute-0 systemd[1]: libpod-057b0e4d6df4a0ca64f69ee1cb788e9eb6dd0f90a046f410062b065e03876d82.scope: Deactivated successfully.
Dec 01 09:15:39 compute-0 podman[100195]: 2025-12-01 09:15:39.179333904 +0000 UTC m=+1.158263915 container died 057b0e4d6df4a0ca64f69ee1cb788e9eb6dd0f90a046f410062b065e03876d82 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_perlman, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0)
Dec 01 09:15:39 compute-0 systemd[1]: libpod-057b0e4d6df4a0ca64f69ee1cb788e9eb6dd0f90a046f410062b065e03876d82.scope: Consumed 1.023s CPU time.
Dec 01 09:15:39 compute-0 systemd[1]: var-lib-containers-storage-overlay-0b5528a13e77310e2f767264d5587e5ec2e1c72d33e8c2f68b21550d5e61de23-merged.mount: Deactivated successfully.
Dec 01 09:15:39 compute-0 podman[100195]: 2025-12-01 09:15:39.229183486 +0000 UTC m=+1.208113497 container remove 057b0e4d6df4a0ca64f69ee1cb788e9eb6dd0f90a046f410062b065e03876d82 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_perlman, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:15:39 compute-0 systemd[1]: libpod-conmon-057b0e4d6df4a0ca64f69ee1cb788e9eb6dd0f90a046f410062b065e03876d82.scope: Deactivated successfully.
Dec 01 09:15:39 compute-0 sudo[100048]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:39 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:15:39 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:39 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:15:39 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:39 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 4d4eb745-8d43-41d0-a7d8-61fd31f97c42 does not exist
Dec 01 09:15:39 compute-0 sudo[100294]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:15:39 compute-0 sudo[100294]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:39 compute-0 sudo[100294]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:39 compute-0 sudo[100319]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:15:39 compute-0 sudo[100319]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:39 compute-0 sudo[100319]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:39 compute-0 sudo[100344]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:15:39 compute-0 sudo[100344]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:39 compute-0 sudo[100344]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:39 compute-0 sudo[100369]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:15:39 compute-0 sudo[100369]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:39 compute-0 sudo[100369]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:39 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.16 scrub starts
Dec 01 09:15:39 compute-0 sudo[100425]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkiueisaavlarjasqdadaotatpaqaweg ; /usr/bin/python3'
Dec 01 09:15:39 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.13 scrub starts
Dec 01 09:15:39 compute-0 sudo[100425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:15:39 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.16 scrub ok
Dec 01 09:15:39 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.13 scrub ok
Dec 01 09:15:39 compute-0 sudo[100407]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:15:39 compute-0 sudo[100407]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:39 compute-0 sudo[100407]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:39 compute-0 sudo[100445]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ls
Dec 01 09:15:39 compute-0 sudo[100445]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:39 compute-0 python3[100442]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   config dump -f json _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:15:39 compute-0 podman[100470]: 2025-12-01 09:15:39.7288301 +0000 UTC m=+0.023925191 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:15:39 compute-0 podman[100470]: 2025-12-01 09:15:39.825399814 +0000 UTC m=+0.120494885 container create 79b33f5b706daf8d884bbf1fe8378db9df2e65a92bec96790f4d2a1908de7f9a (image=quay.io/ceph/ceph:v18, name=focused_meitner, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:15:39 compute-0 systemd[1]: Started libpod-conmon-79b33f5b706daf8d884bbf1fe8378db9df2e65a92bec96790f4d2a1908de7f9a.scope.
Dec 01 09:15:39 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50d0cb215a9f1f88f31a19feb60595f6feffcaae4286670be28f48e2a1861436/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50d0cb215a9f1f88f31a19feb60595f6feffcaae4286670be28f48e2a1861436/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:39 compute-0 podman[100470]: 2025-12-01 09:15:39.910792514 +0000 UTC m=+0.205887605 container init 79b33f5b706daf8d884bbf1fe8378db9df2e65a92bec96790f4d2a1908de7f9a (image=quay.io/ceph/ceph:v18, name=focused_meitner, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec 01 09:15:39 compute-0 podman[100470]: 2025-12-01 09:15:39.918649503 +0000 UTC m=+0.213744594 container start 79b33f5b706daf8d884bbf1fe8378db9df2e65a92bec96790f4d2a1908de7f9a (image=quay.io/ceph/ceph:v18, name=focused_meitner, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:15:39 compute-0 podman[100470]: 2025-12-01 09:15:39.921639458 +0000 UTC m=+0.216734559 container attach 79b33f5b706daf8d884bbf1fe8378db9df2e65a92bec96790f4d2a1908de7f9a (image=quay.io/ceph/ceph:v18, name=focused_meitner, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:15:39 compute-0 ceph-mon[75031]: 2.14 scrub starts
Dec 01 09:15:39 compute-0 ceph-mon[75031]: 2.14 scrub ok
Dec 01 09:15:39 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:39 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:39 compute-0 ceph-mon[75031]: 4.16 scrub starts
Dec 01 09:15:39 compute-0 ceph-mon[75031]: 3.13 scrub starts
Dec 01 09:15:39 compute-0 ceph-mon[75031]: 4.16 scrub ok
Dec 01 09:15:39 compute-0 ceph-mon[75031]: 3.13 scrub ok
Dec 01 09:15:40 compute-0 podman[100558]: 2025-12-01 09:15:40.198338258 +0000 UTC m=+0.051627999 container exec a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Dec 01 09:15:40 compute-0 podman[100558]: 2025-12-01 09:15:40.31531337 +0000 UTC m=+0.168603091 container exec_died a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:15:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config dump", "format": "json"} v 0) v1
Dec 01 09:15:40 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2956228905' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
Dec 01 09:15:40 compute-0 focused_meitner[100498]: 
Dec 01 09:15:40 compute-0 focused_meitner[100498]: [{"section":"global","name":"cluster_network","value":"172.20.0.0/24","level":"advanced","can_update_at_runtime":false,"mask":""},{"section":"global","name":"container_image","value":"quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0","level":"basic","can_update_at_runtime":false,"mask":""},{"section":"global","name":"log_to_file","value":"true","level":"basic","can_update_at_runtime":true,"mask":""},{"section":"global","name":"mon_cluster_log_to_file","value":"true","level":"advanced","can_update_at_runtime":true,"mask":""},{"section":"global","name":"ms_bind_ipv4","value":"true","level":"advanced","can_update_at_runtime":true,"mask":""},{"section":"global","name":"ms_bind_ipv6","value":"false","level":"advanced","can_update_at_runtime":true,"mask":""},{"section":"global","name":"osd_pool_default_size","value":"1","level":"advanced","can_update_at_runtime":true,"mask":""},{"section":"global","name":"public_network","value":"192.168.122.0/24","level":"advanced","can_update_at_runtime":false,"mask":""},{"section":"mon","name":"auth_allow_insecure_global_id_reclaim","value":"false","level":"advanced","can_update_at_runtime":true,"mask":""},{"section":"mon","name":"mon_warn_on_pool_no_redundancy","value":"false","level":"advanced","can_update_at_runtime":true,"mask":""},{"section":"mgr","name":"mgr/cephadm/container_init","value":"True","level":"advanced","can_update_at_runtime":false,"mask":""},{"section":"mgr","name":"mgr/cephadm/migration_current","value":"6","level":"advanced","can_update_at_runtime":false,"mask":""},{"section":"mgr","name":"mgr/cephadm/use_repo_digest","value":"false","level":"advanced","can_update_at_runtime":false,"mask":""},{"section":"mgr","name":"mgr/orchestrator/orchestrator","value":"cephadm","level":"advanced","can_update_at_runtime":true,"mask":""},{"section":"mgr","name":"mgr_standby_modules","value":"false","level":"advanced","can_update_at_runtime":true,"mask":""},{"section":"osd","name":"osd_memory_target_autotune","value":"true","level":"advanced","can_update_at_runtime":true,"mask":""},{"section":"mds.cephfs","name":"mds_join_fs","value":"cephfs","level":"basic","can_update_at_runtime":true,"mask":""}]
Dec 01 09:15:40 compute-0 systemd[1]: libpod-79b33f5b706daf8d884bbf1fe8378db9df2e65a92bec96790f4d2a1908de7f9a.scope: Deactivated successfully.
Dec 01 09:15:40 compute-0 podman[100470]: 2025-12-01 09:15:40.506269099 +0000 UTC m=+0.801364180 container died 79b33f5b706daf8d884bbf1fe8378db9df2e65a92bec96790f4d2a1908de7f9a (image=quay.io/ceph/ceph:v18, name=focused_meitner, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 01 09:15:40 compute-0 systemd[1]: var-lib-containers-storage-overlay-50d0cb215a9f1f88f31a19feb60595f6feffcaae4286670be28f48e2a1861436-merged.mount: Deactivated successfully.
Dec 01 09:15:40 compute-0 podman[100470]: 2025-12-01 09:15:40.553423575 +0000 UTC m=+0.848518646 container remove 79b33f5b706daf8d884bbf1fe8378db9df2e65a92bec96790f4d2a1908de7f9a (image=quay.io/ceph/ceph:v18, name=focused_meitner, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:15:40 compute-0 systemd[1]: libpod-conmon-79b33f5b706daf8d884bbf1fe8378db9df2e65a92bec96790f4d2a1908de7f9a.scope: Deactivated successfully.
Dec 01 09:15:40 compute-0 sudo[100425]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:40 compute-0 sudo[100445]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:15:40 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:15:40 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:15:40 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:15:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec 01 09:15:40 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:15:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec 01 09:15:40 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:40 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 33b134cc-0536-415d-802e-98a63cbf16eb does not exist
Dec 01 09:15:40 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 348d8149-25b0-4f42-825a-54ca2e37b828 does not exist
Dec 01 09:15:40 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 0da4a8eb-855c-41d9-b84d-512a601f067a does not exist
Dec 01 09:15:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec 01 09:15:40 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:15:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec 01 09:15:40 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:15:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:15:40 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:15:40 compute-0 sudo[100732]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:15:40 compute-0 sudo[100732]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:40 compute-0 sudo[100732]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:40 compute-0 sudo[100757]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:15:40 compute-0 sudo[100757]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:40 compute-0 sudo[100757]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:40 compute-0 ceph-mon[75031]: pgmap v111: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s wr, 4 op/s
Dec 01 09:15:40 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2956228905' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
Dec 01 09:15:40 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:40 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:40 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:15:40 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:15:40 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:40 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:15:40 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:15:40 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:15:41 compute-0 sudo[100782]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:15:41 compute-0 sudo[100782]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:41 compute-0 sudo[100782]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:41 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v112: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s wr, 3 op/s
Dec 01 09:15:41 compute-0 sudo[100807]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --yes --no-systemd
Dec 01 09:15:41 compute-0 sudo[100807]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:41 compute-0 sudo[100902]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqoqofafkyrdxrifbnbebttsxzrzmpwc ; /usr/bin/python3'
Dec 01 09:15:41 compute-0 sudo[100902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:15:41 compute-0 podman[100890]: 2025-12-01 09:15:41.382623057 +0000 UTC m=+0.040450924 container create e9775f8c74c9ffa64825dd4dea841ff1136dade98259867036bc0ac833421b2b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_kapitsa, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3)
Dec 01 09:15:41 compute-0 systemd[1]: Started libpod-conmon-e9775f8c74c9ffa64825dd4dea841ff1136dade98259867036bc0ac833421b2b.scope.
Dec 01 09:15:41 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:41 compute-0 podman[100890]: 2025-12-01 09:15:41.461019075 +0000 UTC m=+0.118846972 container init e9775f8c74c9ffa64825dd4dea841ff1136dade98259867036bc0ac833421b2b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_kapitsa, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Dec 01 09:15:41 compute-0 podman[100890]: 2025-12-01 09:15:41.364852544 +0000 UTC m=+0.022680441 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:15:41 compute-0 podman[100890]: 2025-12-01 09:15:41.466672754 +0000 UTC m=+0.124500621 container start e9775f8c74c9ffa64825dd4dea841ff1136dade98259867036bc0ac833421b2b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_kapitsa, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 01 09:15:41 compute-0 stoic_kapitsa[100915]: 167 167
Dec 01 09:15:41 compute-0 podman[100890]: 2025-12-01 09:15:41.469741602 +0000 UTC m=+0.127569499 container attach e9775f8c74c9ffa64825dd4dea841ff1136dade98259867036bc0ac833421b2b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_kapitsa, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:15:41 compute-0 systemd[1]: libpod-e9775f8c74c9ffa64825dd4dea841ff1136dade98259867036bc0ac833421b2b.scope: Deactivated successfully.
Dec 01 09:15:41 compute-0 podman[100890]: 2025-12-01 09:15:41.474047689 +0000 UTC m=+0.131875566 container died e9775f8c74c9ffa64825dd4dea841ff1136dade98259867036bc0ac833421b2b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_kapitsa, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:15:41 compute-0 python3[100909]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd get-require-min-compat-client _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:15:41 compute-0 systemd[1]: var-lib-containers-storage-overlay-9c143e1b901c5bd69e14722c71a5a5a151ca74968e02dab646f3fb177d68e208-merged.mount: Deactivated successfully.
Dec 01 09:15:41 compute-0 podman[100890]: 2025-12-01 09:15:41.510420363 +0000 UTC m=+0.168248230 container remove e9775f8c74c9ffa64825dd4dea841ff1136dade98259867036bc0ac833421b2b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_kapitsa, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:15:41 compute-0 systemd[1]: libpod-conmon-e9775f8c74c9ffa64825dd4dea841ff1136dade98259867036bc0ac833421b2b.scope: Deactivated successfully.
Dec 01 09:15:41 compute-0 podman[100927]: 2025-12-01 09:15:41.56137919 +0000 UTC m=+0.044228345 container create 92cb1d133a5e7a509eb2ebf8cdda5505b74348e48d503bb7f36aba0c3b82be73 (image=quay.io/ceph/ceph:v18, name=romantic_black, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:15:41 compute-0 systemd[1]: Started libpod-conmon-92cb1d133a5e7a509eb2ebf8cdda5505b74348e48d503bb7f36aba0c3b82be73.scope.
Dec 01 09:15:41 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b72ad21ba273ee8d15db9d297e94666334948956b6b1fa51533a9559d4bd057/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b72ad21ba273ee8d15db9d297e94666334948956b6b1fa51533a9559d4bd057/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:41 compute-0 podman[100927]: 2025-12-01 09:15:41.627437686 +0000 UTC m=+0.110286871 container init 92cb1d133a5e7a509eb2ebf8cdda5505b74348e48d503bb7f36aba0c3b82be73 (image=quay.io/ceph/ceph:v18, name=romantic_black, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 01 09:15:41 compute-0 podman[100927]: 2025-12-01 09:15:41.634066886 +0000 UTC m=+0.116916041 container start 92cb1d133a5e7a509eb2ebf8cdda5505b74348e48d503bb7f36aba0c3b82be73 (image=quay.io/ceph/ceph:v18, name=romantic_black, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_REF=reef)
Dec 01 09:15:41 compute-0 podman[100927]: 2025-12-01 09:15:41.544153863 +0000 UTC m=+0.027003038 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:15:41 compute-0 podman[100927]: 2025-12-01 09:15:41.63859272 +0000 UTC m=+0.121441875 container attach 92cb1d133a5e7a509eb2ebf8cdda5505b74348e48d503bb7f36aba0c3b82be73 (image=quay.io/ceph/ceph:v18, name=romantic_black, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True)
Dec 01 09:15:41 compute-0 podman[100959]: 2025-12-01 09:15:41.686459539 +0000 UTC m=+0.047264921 container create c1afe52bb80e94e4e5d11f87ede507df8b9370cbb5fd25df783d4557f83bc2f9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_lamarr, ceph=True, CEPH_REF=reef, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Dec 01 09:15:41 compute-0 systemd[1]: Started libpod-conmon-c1afe52bb80e94e4e5d11f87ede507df8b9370cbb5fd25df783d4557f83bc2f9.scope.
Dec 01 09:15:41 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:41 compute-0 podman[100959]: 2025-12-01 09:15:41.662964093 +0000 UTC m=+0.023769505 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:15:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4fee8141e725dd613c372bb53a33b48d6f1a9b3996ca06e0ed53d8fe860a9000/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4fee8141e725dd613c372bb53a33b48d6f1a9b3996ca06e0ed53d8fe860a9000/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4fee8141e725dd613c372bb53a33b48d6f1a9b3996ca06e0ed53d8fe860a9000/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4fee8141e725dd613c372bb53a33b48d6f1a9b3996ca06e0ed53d8fe860a9000/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4fee8141e725dd613c372bb53a33b48d6f1a9b3996ca06e0ed53d8fe860a9000/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:41 compute-0 podman[100959]: 2025-12-01 09:15:41.791737189 +0000 UTC m=+0.152542581 container init c1afe52bb80e94e4e5d11f87ede507df8b9370cbb5fd25df783d4557f83bc2f9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_lamarr, io.buildah.version=1.39.3, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 01 09:15:41 compute-0 podman[100959]: 2025-12-01 09:15:41.799100323 +0000 UTC m=+0.159905705 container start c1afe52bb80e94e4e5d11f87ede507df8b9370cbb5fd25df783d4557f83bc2f9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_lamarr, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Dec 01 09:15:41 compute-0 podman[100959]: 2025-12-01 09:15:41.802561343 +0000 UTC m=+0.163366745 container attach c1afe52bb80e94e4e5d11f87ede507df8b9370cbb5fd25df783d4557f83bc2f9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_lamarr, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:15:42 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd get-require-min-compat-client"} v 0) v1
Dec 01 09:15:42 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/951861402' entity='client.admin' cmd=[{"prefix": "osd get-require-min-compat-client"}]: dispatch
Dec 01 09:15:42 compute-0 romantic_black[100951]: mimic
Dec 01 09:15:42 compute-0 systemd[1]: libpod-92cb1d133a5e7a509eb2ebf8cdda5505b74348e48d503bb7f36aba0c3b82be73.scope: Deactivated successfully.
Dec 01 09:15:42 compute-0 podman[101003]: 2025-12-01 09:15:42.274907871 +0000 UTC m=+0.025274753 container died 92cb1d133a5e7a509eb2ebf8cdda5505b74348e48d503bb7f36aba0c3b82be73 (image=quay.io/ceph/ceph:v18, name=romantic_black, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Dec 01 09:15:42 compute-0 systemd[1]: var-lib-containers-storage-overlay-2b72ad21ba273ee8d15db9d297e94666334948956b6b1fa51533a9559d4bd057-merged.mount: Deactivated successfully.
Dec 01 09:15:42 compute-0 podman[101003]: 2025-12-01 09:15:42.330668331 +0000 UTC m=+0.081035183 container remove 92cb1d133a5e7a509eb2ebf8cdda5505b74348e48d503bb7f36aba0c3b82be73 (image=quay.io/ceph/ceph:v18, name=romantic_black, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=reef)
Dec 01 09:15:42 compute-0 systemd[1]: libpod-conmon-92cb1d133a5e7a509eb2ebf8cdda5505b74348e48d503bb7f36aba0c3b82be73.scope: Deactivated successfully.
Dec 01 09:15:42 compute-0 sudo[100902]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:42 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.1a scrub starts
Dec 01 09:15:42 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.1a scrub ok
Dec 01 09:15:42 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.17 scrub starts
Dec 01 09:15:42 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.17 scrub ok
Dec 01 09:15:42 compute-0 interesting_lamarr[100977]: --> passed data devices: 0 physical, 3 LVM
Dec 01 09:15:42 compute-0 interesting_lamarr[100977]: --> relative data size: 1.0
Dec 01 09:15:42 compute-0 interesting_lamarr[100977]: --> All data devices are unavailable
Dec 01 09:15:42 compute-0 systemd[1]: libpod-c1afe52bb80e94e4e5d11f87ede507df8b9370cbb5fd25df783d4557f83bc2f9.scope: Deactivated successfully.
Dec 01 09:15:42 compute-0 systemd[1]: libpod-c1afe52bb80e94e4e5d11f87ede507df8b9370cbb5fd25df783d4557f83bc2f9.scope: Consumed 1.067s CPU time.
Dec 01 09:15:42 compute-0 podman[100959]: 2025-12-01 09:15:42.939005184 +0000 UTC m=+1.299810576 container died c1afe52bb80e94e4e5d11f87ede507df8b9370cbb5fd25df783d4557f83bc2f9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_lamarr, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec 01 09:15:42 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:15:42 compute-0 systemd[1]: var-lib-containers-storage-overlay-4fee8141e725dd613c372bb53a33b48d6f1a9b3996ca06e0ed53d8fe860a9000-merged.mount: Deactivated successfully.
Dec 01 09:15:42 compute-0 ceph-mon[75031]: pgmap v112: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s wr, 3 op/s
Dec 01 09:15:42 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/951861402' entity='client.admin' cmd=[{"prefix": "osd get-require-min-compat-client"}]: dispatch
Dec 01 09:15:42 compute-0 ceph-mon[75031]: 4.17 scrub starts
Dec 01 09:15:42 compute-0 ceph-mon[75031]: 4.17 scrub ok
Dec 01 09:15:43 compute-0 podman[100959]: 2025-12-01 09:15:43.001354642 +0000 UTC m=+1.362160074 container remove c1afe52bb80e94e4e5d11f87ede507df8b9370cbb5fd25df783d4557f83bc2f9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_lamarr, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef)
Dec 01 09:15:43 compute-0 systemd[1]: libpod-conmon-c1afe52bb80e94e4e5d11f87ede507df8b9370cbb5fd25df783d4557f83bc2f9.scope: Deactivated successfully.
Dec 01 09:15:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:15:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:15:43 compute-0 sudo[100807]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:43 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v113: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s wr, 3 op/s
Dec 01 09:15:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:15:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:15:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:15:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:15:43 compute-0 sudo[101055]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:15:43 compute-0 sudo[101055]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:43 compute-0 sudo[101055]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:43 compute-0 sudo[101110]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvuvcxazgbpemynqvskmpkhtiiokhgtd ; /usr/bin/python3'
Dec 01 09:15:43 compute-0 sudo[101110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:15:43 compute-0 sudo[101095]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:15:43 compute-0 sudo[101095]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:43 compute-0 sudo[101095]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:43 compute-0 sudo[101131]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:15:43 compute-0 sudo[101131]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:43 compute-0 sudo[101131]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:43 compute-0 sudo[101156]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- lvm list --format json
Dec 01 09:15:43 compute-0 sudo[101156]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:43 compute-0 python3[101127]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v18 --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   versions -f json _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:15:43 compute-0 podman[101181]: 2025-12-01 09:15:43.360285311 +0000 UTC m=+0.050261976 container create d8f4033eb59cb62445a4b8580f5673040f519f37649e4388d52c974ccdfa86b2 (image=quay.io/ceph/ceph:v18, name=upbeat_bohr, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:15:43 compute-0 systemd[1]: Started libpod-conmon-d8f4033eb59cb62445a4b8580f5673040f519f37649e4388d52c974ccdfa86b2.scope.
Dec 01 09:15:43 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:43 compute-0 podman[101181]: 2025-12-01 09:15:43.335739552 +0000 UTC m=+0.025716237 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph:v18
Dec 01 09:15:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ee2b9c64821ba316465b4e7d4ba7fa9e75fe7d0d1acb6d33683fb83290c3149/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ee2b9c64821ba316465b4e7d4ba7fa9e75fe7d0d1acb6d33683fb83290c3149/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:43 compute-0 podman[101181]: 2025-12-01 09:15:43.445774684 +0000 UTC m=+0.135751379 container init d8f4033eb59cb62445a4b8580f5673040f519f37649e4388d52c974ccdfa86b2 (image=quay.io/ceph/ceph:v18, name=upbeat_bohr, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec 01 09:15:43 compute-0 podman[101181]: 2025-12-01 09:15:43.453017894 +0000 UTC m=+0.142994569 container start d8f4033eb59cb62445a4b8580f5673040f519f37649e4388d52c974ccdfa86b2 (image=quay.io/ceph/ceph:v18, name=upbeat_bohr, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:15:43 compute-0 podman[101181]: 2025-12-01 09:15:43.457747504 +0000 UTC m=+0.147724189 container attach d8f4033eb59cb62445a4b8580f5673040f519f37649e4388d52c974ccdfa86b2 (image=quay.io/ceph/ceph:v18, name=upbeat_bohr, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:15:43 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.14 scrub starts
Dec 01 09:15:43 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.14 scrub ok
Dec 01 09:15:43 compute-0 podman[101240]: 2025-12-01 09:15:43.62779756 +0000 UTC m=+0.037427239 container create c79f41c92773381d2c9d76139e11fd7b43194f04a021775b6593529880258da5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_roentgen, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 01 09:15:43 compute-0 systemd[1]: Started libpod-conmon-c79f41c92773381d2c9d76139e11fd7b43194f04a021775b6593529880258da5.scope.
Dec 01 09:15:43 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:43 compute-0 podman[101240]: 2025-12-01 09:15:43.611637087 +0000 UTC m=+0.021266786 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:15:43 compute-0 podman[101240]: 2025-12-01 09:15:43.712984353 +0000 UTC m=+0.122614072 container init c79f41c92773381d2c9d76139e11fd7b43194f04a021775b6593529880258da5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_roentgen, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507)
Dec 01 09:15:43 compute-0 podman[101240]: 2025-12-01 09:15:43.72264507 +0000 UTC m=+0.132274759 container start c79f41c92773381d2c9d76139e11fd7b43194f04a021775b6593529880258da5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_roentgen, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:15:43 compute-0 intelligent_roentgen[101256]: 167 167
Dec 01 09:15:43 compute-0 podman[101240]: 2025-12-01 09:15:43.726386418 +0000 UTC m=+0.136016137 container attach c79f41c92773381d2c9d76139e11fd7b43194f04a021775b6593529880258da5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_roentgen, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, ceph=True)
Dec 01 09:15:43 compute-0 systemd[1]: libpod-c79f41c92773381d2c9d76139e11fd7b43194f04a021775b6593529880258da5.scope: Deactivated successfully.
Dec 01 09:15:43 compute-0 podman[101240]: 2025-12-01 09:15:43.727632468 +0000 UTC m=+0.137262167 container died c79f41c92773381d2c9d76139e11fd7b43194f04a021775b6593529880258da5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_roentgen, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:15:43 compute-0 systemd[1]: var-lib-containers-storage-overlay-6f7388fa5e5883bbf2d80d7669fc54258c873712ed6e250f5d09dd17d663eafe-merged.mount: Deactivated successfully.
Dec 01 09:15:43 compute-0 podman[101240]: 2025-12-01 09:15:43.768935249 +0000 UTC m=+0.178564928 container remove c79f41c92773381d2c9d76139e11fd7b43194f04a021775b6593529880258da5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_roentgen, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:15:43 compute-0 systemd[1]: libpod-conmon-c79f41c92773381d2c9d76139e11fd7b43194f04a021775b6593529880258da5.scope: Deactivated successfully.
Dec 01 09:15:43 compute-0 podman[101299]: 2025-12-01 09:15:43.924568197 +0000 UTC m=+0.041066464 container create 18952a467fb5b53397715dc2359ae3160e306fc4e665de3c40034c10cfa571e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_williams, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 01 09:15:43 compute-0 systemd[1]: Started libpod-conmon-18952a467fb5b53397715dc2359ae3160e306fc4e665de3c40034c10cfa571e6.scope.
Dec 01 09:15:43 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49daba2b0a33257a9368df698cc66aeafeb40360dcc9dda283632220b4353201/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:43 compute-0 ceph-mon[75031]: 2.1a scrub starts
Dec 01 09:15:43 compute-0 ceph-mon[75031]: 2.1a scrub ok
Dec 01 09:15:43 compute-0 ceph-mon[75031]: pgmap v113: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s wr, 3 op/s
Dec 01 09:15:43 compute-0 ceph-mon[75031]: 3.14 scrub starts
Dec 01 09:15:43 compute-0 ceph-mon[75031]: 3.14 scrub ok
Dec 01 09:15:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49daba2b0a33257a9368df698cc66aeafeb40360dcc9dda283632220b4353201/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49daba2b0a33257a9368df698cc66aeafeb40360dcc9dda283632220b4353201/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49daba2b0a33257a9368df698cc66aeafeb40360dcc9dda283632220b4353201/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:43 compute-0 podman[101299]: 2025-12-01 09:15:43.998145582 +0000 UTC m=+0.114643889 container init 18952a467fb5b53397715dc2359ae3160e306fc4e665de3c40034c10cfa571e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_williams, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:15:44 compute-0 podman[101299]: 2025-12-01 09:15:43.90764011 +0000 UTC m=+0.024138397 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:15:44 compute-0 podman[101299]: 2025-12-01 09:15:44.005533226 +0000 UTC m=+0.122031493 container start 18952a467fb5b53397715dc2359ae3160e306fc4e665de3c40034c10cfa571e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_williams, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True)
Dec 01 09:15:44 compute-0 podman[101299]: 2025-12-01 09:15:44.009945216 +0000 UTC m=+0.126443483 container attach 18952a467fb5b53397715dc2359ae3160e306fc4e665de3c40034c10cfa571e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_williams, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Dec 01 09:15:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "versions", "format": "json"} v 0) v1
Dec 01 09:15:44 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1401116377' entity='client.admin' cmd=[{"prefix": "versions", "format": "json"}]: dispatch
Dec 01 09:15:44 compute-0 upbeat_bohr[101196]: 
Dec 01 09:15:44 compute-0 systemd[1]: libpod-d8f4033eb59cb62445a4b8580f5673040f519f37649e4388d52c974ccdfa86b2.scope: Deactivated successfully.
Dec 01 09:15:44 compute-0 upbeat_bohr[101196]: {"mon":{"ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable)":1},"mgr":{"ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable)":1},"osd":{"ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable)":3},"mds":{"ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable)":1},"overall":{"ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable)":6}}
Dec 01 09:15:44 compute-0 podman[101181]: 2025-12-01 09:15:44.106214801 +0000 UTC m=+0.796191466 container died d8f4033eb59cb62445a4b8580f5673040f519f37649e4388d52c974ccdfa86b2 (image=quay.io/ceph/ceph:v18, name=upbeat_bohr, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:15:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-0ee2b9c64821ba316465b4e7d4ba7fa9e75fe7d0d1acb6d33683fb83290c3149-merged.mount: Deactivated successfully.
Dec 01 09:15:44 compute-0 podman[101181]: 2025-12-01 09:15:44.150734544 +0000 UTC m=+0.840711209 container remove d8f4033eb59cb62445a4b8580f5673040f519f37649e4388d52c974ccdfa86b2 (image=quay.io/ceph/ceph:v18, name=upbeat_bohr, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:15:44 compute-0 systemd[1]: libpod-conmon-d8f4033eb59cb62445a4b8580f5673040f519f37649e4388d52c974ccdfa86b2.scope: Deactivated successfully.
Dec 01 09:15:44 compute-0 sudo[101110]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:44 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.1e scrub starts
Dec 01 09:15:44 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.1e scrub ok
Dec 01 09:15:44 compute-0 admiring_williams[101316]: {
Dec 01 09:15:44 compute-0 admiring_williams[101316]:     "0": [
Dec 01 09:15:44 compute-0 admiring_williams[101316]:         {
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             "devices": [
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "/dev/loop3"
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             ],
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             "lv_name": "ceph_lv0",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             "lv_size": "21470642176",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             "name": "ceph_lv0",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             "tags": {
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.cluster_name": "ceph",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.crush_device_class": "",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.encrypted": "0",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.osd_id": "0",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.type": "block",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.vdo": "0"
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             },
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             "type": "block",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             "vg_name": "ceph_vg0"
Dec 01 09:15:44 compute-0 admiring_williams[101316]:         }
Dec 01 09:15:44 compute-0 admiring_williams[101316]:     ],
Dec 01 09:15:44 compute-0 admiring_williams[101316]:     "1": [
Dec 01 09:15:44 compute-0 admiring_williams[101316]:         {
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             "devices": [
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "/dev/loop4"
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             ],
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             "lv_name": "ceph_lv1",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             "lv_size": "21470642176",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             "name": "ceph_lv1",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             "tags": {
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.cluster_name": "ceph",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.crush_device_class": "",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.encrypted": "0",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.osd_id": "1",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.type": "block",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.vdo": "0"
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             },
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             "type": "block",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             "vg_name": "ceph_vg1"
Dec 01 09:15:44 compute-0 admiring_williams[101316]:         }
Dec 01 09:15:44 compute-0 admiring_williams[101316]:     ],
Dec 01 09:15:44 compute-0 admiring_williams[101316]:     "2": [
Dec 01 09:15:44 compute-0 admiring_williams[101316]:         {
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             "devices": [
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "/dev/loop5"
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             ],
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             "lv_name": "ceph_lv2",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             "lv_size": "21470642176",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             "name": "ceph_lv2",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             "tags": {
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.cluster_name": "ceph",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.crush_device_class": "",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.encrypted": "0",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.osd_id": "2",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.type": "block",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:                 "ceph.vdo": "0"
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             },
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             "type": "block",
Dec 01 09:15:44 compute-0 admiring_williams[101316]:             "vg_name": "ceph_vg2"
Dec 01 09:15:44 compute-0 admiring_williams[101316]:         }
Dec 01 09:15:44 compute-0 admiring_williams[101316]:     ]
Dec 01 09:15:44 compute-0 admiring_williams[101316]: }
Dec 01 09:15:44 compute-0 systemd[1]: libpod-18952a467fb5b53397715dc2359ae3160e306fc4e665de3c40034c10cfa571e6.scope: Deactivated successfully.
Dec 01 09:15:44 compute-0 podman[101339]: 2025-12-01 09:15:44.8543031 +0000 UTC m=+0.023410624 container died 18952a467fb5b53397715dc2359ae3160e306fc4e665de3c40034c10cfa571e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_williams, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507)
Dec 01 09:15:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-49daba2b0a33257a9368df698cc66aeafeb40360dcc9dda283632220b4353201-merged.mount: Deactivated successfully.
Dec 01 09:15:44 compute-0 podman[101339]: 2025-12-01 09:15:44.90220911 +0000 UTC m=+0.071316624 container remove 18952a467fb5b53397715dc2359ae3160e306fc4e665de3c40034c10cfa571e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_williams, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.build-date=20250507, ceph=True, org.label-schema.schema-version=1.0)
Dec 01 09:15:44 compute-0 systemd[1]: libpod-conmon-18952a467fb5b53397715dc2359ae3160e306fc4e665de3c40034c10cfa571e6.scope: Deactivated successfully.
Dec 01 09:15:44 compute-0 sudo[101156]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:44 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1401116377' entity='client.admin' cmd=[{"prefix": "versions", "format": "json"}]: dispatch
Dec 01 09:15:44 compute-0 sudo[101354]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:15:44 compute-0 sudo[101354]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:44 compute-0 sudo[101354]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:45 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v114: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s wr, 2 op/s
Dec 01 09:15:45 compute-0 sudo[101379]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:15:45 compute-0 sudo[101379]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:45 compute-0 sudo[101379]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:45 compute-0 sudo[101404]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:15:45 compute-0 sudo[101404]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:45 compute-0 sudo[101404]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:45 compute-0 sudo[101429]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- raw list --format json
Dec 01 09:15:45 compute-0 sudo[101429]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:45 compute-0 podman[101493]: 2025-12-01 09:15:45.473951742 +0000 UTC m=+0.039387470 container create 368677ba68969dd02ac3cdd6fb2e00062e861df355823ca53536812bd89e33e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_chatelet, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Dec 01 09:15:45 compute-0 systemd[1]: Started libpod-conmon-368677ba68969dd02ac3cdd6fb2e00062e861df355823ca53536812bd89e33e6.scope.
Dec 01 09:15:45 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:45 compute-0 podman[101493]: 2025-12-01 09:15:45.549039415 +0000 UTC m=+0.114475163 container init 368677ba68969dd02ac3cdd6fb2e00062e861df355823ca53536812bd89e33e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_chatelet, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:15:45 compute-0 podman[101493]: 2025-12-01 09:15:45.456899011 +0000 UTC m=+0.022334759 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:15:45 compute-0 podman[101493]: 2025-12-01 09:15:45.55486616 +0000 UTC m=+0.120301888 container start 368677ba68969dd02ac3cdd6fb2e00062e861df355823ca53536812bd89e33e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_chatelet, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507)
Dec 01 09:15:45 compute-0 podman[101493]: 2025-12-01 09:15:45.55834557 +0000 UTC m=+0.123781298 container attach 368677ba68969dd02ac3cdd6fb2e00062e861df355823ca53536812bd89e33e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_chatelet, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 01 09:15:45 compute-0 priceless_chatelet[101509]: 167 167
Dec 01 09:15:45 compute-0 systemd[1]: libpod-368677ba68969dd02ac3cdd6fb2e00062e861df355823ca53536812bd89e33e6.scope: Deactivated successfully.
Dec 01 09:15:45 compute-0 podman[101493]: 2025-12-01 09:15:45.559969512 +0000 UTC m=+0.125405240 container died 368677ba68969dd02ac3cdd6fb2e00062e861df355823ca53536812bd89e33e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_chatelet, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:15:45 compute-0 systemd[1]: var-lib-containers-storage-overlay-cf16fc7b86721153131d516e223146f8627eeda081d424595ee8eb12ab141250-merged.mount: Deactivated successfully.
Dec 01 09:15:45 compute-0 podman[101493]: 2025-12-01 09:15:45.594512398 +0000 UTC m=+0.159948126 container remove 368677ba68969dd02ac3cdd6fb2e00062e861df355823ca53536812bd89e33e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_chatelet, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec 01 09:15:45 compute-0 systemd[1]: libpod-conmon-368677ba68969dd02ac3cdd6fb2e00062e861df355823ca53536812bd89e33e6.scope: Deactivated successfully.
Dec 01 09:15:45 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.19 scrub starts
Dec 01 09:15:45 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.19 scrub ok
Dec 01 09:15:45 compute-0 podman[101532]: 2025-12-01 09:15:45.748621098 +0000 UTC m=+0.040105154 container create 86a69249d529060859e51b75826d4fcc848f8cc2a499ca58fb1fbc73a008c957 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_hugle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec 01 09:15:45 compute-0 systemd[1]: Started libpod-conmon-86a69249d529060859e51b75826d4fcc848f8cc2a499ca58fb1fbc73a008c957.scope.
Dec 01 09:15:45 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:15:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/03edf129755e443dd4d0fe98c37c00e242cbc121c8c5a7856402ec876cb233bb/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/03edf129755e443dd4d0fe98c37c00e242cbc121c8c5a7856402ec876cb233bb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/03edf129755e443dd4d0fe98c37c00e242cbc121c8c5a7856402ec876cb233bb/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/03edf129755e443dd4d0fe98c37c00e242cbc121c8c5a7856402ec876cb233bb/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:15:45 compute-0 podman[101532]: 2025-12-01 09:15:45.827657066 +0000 UTC m=+0.119141142 container init 86a69249d529060859e51b75826d4fcc848f8cc2a499ca58fb1fbc73a008c957 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_hugle, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0)
Dec 01 09:15:45 compute-0 podman[101532]: 2025-12-01 09:15:45.732173236 +0000 UTC m=+0.023657312 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:15:45 compute-0 podman[101532]: 2025-12-01 09:15:45.837806638 +0000 UTC m=+0.129290694 container start 86a69249d529060859e51b75826d4fcc848f8cc2a499ca58fb1fbc73a008c957 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_hugle, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3)
Dec 01 09:15:45 compute-0 podman[101532]: 2025-12-01 09:15:45.840816084 +0000 UTC m=+0.132300140 container attach 86a69249d529060859e51b75826d4fcc848f8cc2a499ca58fb1fbc73a008c957 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_hugle, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Dec 01 09:15:45 compute-0 ceph-mon[75031]: 2.1e scrub starts
Dec 01 09:15:45 compute-0 ceph-mon[75031]: 2.1e scrub ok
Dec 01 09:15:45 compute-0 ceph-mon[75031]: pgmap v114: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s wr, 2 op/s
Dec 01 09:15:45 compute-0 ceph-mon[75031]: 4.19 scrub starts
Dec 01 09:15:45 compute-0 ceph-mon[75031]: 4.19 scrub ok
Dec 01 09:15:46 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.1d scrub starts
Dec 01 09:15:46 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.1d scrub ok
Dec 01 09:15:46 compute-0 strange_hugle[101549]: {
Dec 01 09:15:46 compute-0 strange_hugle[101549]:     "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec 01 09:15:46 compute-0 strange_hugle[101549]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:15:46 compute-0 strange_hugle[101549]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 01 09:15:46 compute-0 strange_hugle[101549]:         "osd_id": 0,
Dec 01 09:15:46 compute-0 strange_hugle[101549]:         "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:15:46 compute-0 strange_hugle[101549]:         "type": "bluestore"
Dec 01 09:15:46 compute-0 strange_hugle[101549]:     },
Dec 01 09:15:46 compute-0 strange_hugle[101549]:     "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec 01 09:15:46 compute-0 strange_hugle[101549]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:15:46 compute-0 strange_hugle[101549]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 01 09:15:46 compute-0 strange_hugle[101549]:         "osd_id": 1,
Dec 01 09:15:46 compute-0 strange_hugle[101549]:         "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:15:46 compute-0 strange_hugle[101549]:         "type": "bluestore"
Dec 01 09:15:46 compute-0 strange_hugle[101549]:     },
Dec 01 09:15:46 compute-0 strange_hugle[101549]:     "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec 01 09:15:46 compute-0 strange_hugle[101549]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:15:46 compute-0 strange_hugle[101549]:         "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec 01 09:15:46 compute-0 strange_hugle[101549]:         "osd_id": 2,
Dec 01 09:15:46 compute-0 strange_hugle[101549]:         "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:15:46 compute-0 strange_hugle[101549]:         "type": "bluestore"
Dec 01 09:15:46 compute-0 strange_hugle[101549]:     }
Dec 01 09:15:46 compute-0 strange_hugle[101549]: }
Dec 01 09:15:46 compute-0 systemd[1]: libpod-86a69249d529060859e51b75826d4fcc848f8cc2a499ca58fb1fbc73a008c957.scope: Deactivated successfully.
Dec 01 09:15:46 compute-0 podman[101532]: 2025-12-01 09:15:46.834482343 +0000 UTC m=+1.125966399 container died 86a69249d529060859e51b75826d4fcc848f8cc2a499ca58fb1fbc73a008c957 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_hugle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=reef, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:15:46 compute-0 systemd[1]: libpod-86a69249d529060859e51b75826d4fcc848f8cc2a499ca58fb1fbc73a008c957.scope: Consumed 1.004s CPU time.
Dec 01 09:15:46 compute-0 systemd[1]: var-lib-containers-storage-overlay-03edf129755e443dd4d0fe98c37c00e242cbc121c8c5a7856402ec876cb233bb-merged.mount: Deactivated successfully.
Dec 01 09:15:46 compute-0 podman[101532]: 2025-12-01 09:15:46.892655539 +0000 UTC m=+1.184139595 container remove 86a69249d529060859e51b75826d4fcc848f8cc2a499ca58fb1fbc73a008c957 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_hugle, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507)
Dec 01 09:15:46 compute-0 systemd[1]: libpod-conmon-86a69249d529060859e51b75826d4fcc848f8cc2a499ca58fb1fbc73a008c957.scope: Deactivated successfully.
Dec 01 09:15:46 compute-0 sudo[101429]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:46 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:15:46 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:46 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:15:46 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:46 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 9a0b826e-5c61-49b1-9445-7eec89f0ba24 does not exist
Dec 01 09:15:47 compute-0 ceph-mon[75031]: 4.1d scrub starts
Dec 01 09:15:47 compute-0 ceph-mon[75031]: 4.1d scrub ok
Dec 01 09:15:47 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:47 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:15:47 compute-0 sudo[101596]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:15:47 compute-0 sudo[101596]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:47 compute-0 sudo[101596]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:47 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v115: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s wr, 2 op/s
Dec 01 09:15:47 compute-0 sudo[101621]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:15:47 compute-0 sudo[101621]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:15:47 compute-0 sudo[101621]: pam_unix(sudo:session): session closed for user root
Dec 01 09:15:47 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.6 scrub starts
Dec 01 09:15:47 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.6 scrub ok
Dec 01 09:15:47 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:15:48 compute-0 ceph-mon[75031]: pgmap v115: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s wr, 2 op/s
Dec 01 09:15:49 compute-0 ceph-mon[75031]: 5.6 scrub starts
Dec 01 09:15:49 compute-0 ceph-mon[75031]: 5.6 scrub ok
Dec 01 09:15:49 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v116: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 0 B/s wr, 0 op/s
Dec 01 09:15:49 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.8 scrub starts
Dec 01 09:15:49 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.8 scrub ok
Dec 01 09:15:49 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.19 scrub starts
Dec 01 09:15:49 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.19 scrub ok
Dec 01 09:15:50 compute-0 ceph-mon[75031]: pgmap v116: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 0 B/s wr, 0 op/s
Dec 01 09:15:50 compute-0 ceph-mon[75031]: 3.19 scrub starts
Dec 01 09:15:50 compute-0 ceph-mon[75031]: 3.19 scrub ok
Dec 01 09:15:50 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.a scrub starts
Dec 01 09:15:50 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.a scrub ok
Dec 01 09:15:51 compute-0 ceph-mon[75031]: 5.8 scrub starts
Dec 01 09:15:51 compute-0 ceph-mon[75031]: 5.8 scrub ok
Dec 01 09:15:51 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v117: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:51 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.b scrub starts
Dec 01 09:15:51 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.b scrub ok
Dec 01 09:15:52 compute-0 ceph-mon[75031]: 5.a scrub starts
Dec 01 09:15:52 compute-0 ceph-mon[75031]: 5.a scrub ok
Dec 01 09:15:52 compute-0 ceph-mon[75031]: pgmap v117: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:52 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.1e scrub starts
Dec 01 09:15:52 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.1e scrub ok
Dec 01 09:15:52 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:15:53 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v118: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:53 compute-0 ceph-mon[75031]: 5.b scrub starts
Dec 01 09:15:53 compute-0 ceph-mon[75031]: 5.b scrub ok
Dec 01 09:15:53 compute-0 ceph-mon[75031]: 4.1e scrub starts
Dec 01 09:15:53 compute-0 ceph-mon[75031]: 4.1e scrub ok
Dec 01 09:15:54 compute-0 ceph-mon[75031]: pgmap v118: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:54 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.d scrub starts
Dec 01 09:15:54 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.d scrub ok
Dec 01 09:15:54 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.1a scrub starts
Dec 01 09:15:54 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.1a scrub ok
Dec 01 09:15:54 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.1f scrub starts
Dec 01 09:15:54 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.1f scrub ok
Dec 01 09:15:55 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v119: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:55 compute-0 ceph-mon[75031]: 3.1a scrub starts
Dec 01 09:15:55 compute-0 ceph-mon[75031]: 3.1a scrub ok
Dec 01 09:15:55 compute-0 ceph-mon[75031]: 4.1f scrub starts
Dec 01 09:15:55 compute-0 ceph-mon[75031]: 4.1f scrub ok
Dec 01 09:15:55 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.1c scrub starts
Dec 01 09:15:55 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.1c scrub ok
Dec 01 09:15:55 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.3 scrub starts
Dec 01 09:15:55 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.3 scrub ok
Dec 01 09:15:56 compute-0 ceph-mon[75031]: 5.d scrub starts
Dec 01 09:15:56 compute-0 ceph-mon[75031]: 5.d scrub ok
Dec 01 09:15:56 compute-0 ceph-mon[75031]: pgmap v119: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:56 compute-0 ceph-mon[75031]: 3.1c scrub starts
Dec 01 09:15:56 compute-0 ceph-mon[75031]: 3.1c scrub ok
Dec 01 09:15:56 compute-0 ceph-mon[75031]: 6.3 scrub starts
Dec 01 09:15:56 compute-0 ceph-mon[75031]: 6.3 scrub ok
Dec 01 09:15:56 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.e scrub starts
Dec 01 09:15:56 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.e scrub ok
Dec 01 09:15:56 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.5 scrub starts
Dec 01 09:15:56 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.5 scrub ok
Dec 01 09:15:57 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v120: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:57 compute-0 ceph-mon[75031]: 6.5 scrub starts
Dec 01 09:15:57 compute-0 ceph-mon[75031]: 6.5 scrub ok
Dec 01 09:15:57 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.7 scrub starts
Dec 01 09:15:57 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.7 scrub ok
Dec 01 09:15:57 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.7 scrub starts
Dec 01 09:15:57 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.7 scrub ok
Dec 01 09:15:57 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:15:58 compute-0 ceph-mon[75031]: 5.e scrub starts
Dec 01 09:15:58 compute-0 ceph-mon[75031]: 5.e scrub ok
Dec 01 09:15:58 compute-0 ceph-mon[75031]: pgmap v120: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:15:58 compute-0 ceph-mon[75031]: 7.7 scrub starts
Dec 01 09:15:58 compute-0 ceph-mon[75031]: 7.7 scrub ok
Dec 01 09:15:58 compute-0 ceph-mon[75031]: 6.7 scrub starts
Dec 01 09:15:58 compute-0 ceph-mon[75031]: 6.7 scrub ok
Dec 01 09:15:59 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v121: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:00 compute-0 ceph-mon[75031]: pgmap v121: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:00 compute-0 sshd-session[101646]: Accepted publickey for zuul from 192.168.122.30 port 59902 ssh2: ECDSA SHA256:qeYLzcMt7u+aIXWvxTim9ZQrhR80x0VMZgLc05vjQmw
Dec 01 09:16:00 compute-0 systemd-logind[788]: New session 34 of user zuul.
Dec 01 09:16:00 compute-0 systemd[1]: Started Session 34 of User zuul.
Dec 01 09:16:00 compute-0 sshd-session[101646]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:16:01 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v122: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:01 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.b scrub starts
Dec 01 09:16:01 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.b scrub ok
Dec 01 09:16:01 compute-0 python3.9[101799]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:16:01 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.9 scrub starts
Dec 01 09:16:01 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.9 scrub ok
Dec 01 09:16:02 compute-0 ceph-mon[75031]: pgmap v122: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:02 compute-0 ceph-mon[75031]: 7.b scrub starts
Dec 01 09:16:02 compute-0 ceph-mon[75031]: 7.b scrub ok
Dec 01 09:16:02 compute-0 ceph-mon[75031]: 6.9 scrub starts
Dec 01 09:16:02 compute-0 ceph-mon[75031]: 6.9 scrub ok
Dec 01 09:16:02 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:16:03 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v123: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:03 compute-0 sudo[102015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxgukpxasijmtgyaoznzqhpyoqopqkuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580562.5554416-32-167362541230984/AnsiballZ_command.py'
Dec 01 09:16:03 compute-0 sudo[102015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:16:03 compute-0 python3.9[102017]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                             pushd /var/tmp
                                             curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                             pushd repo-setup-main
                                             python3 -m venv ./venv
                                             PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                             ./venv/bin/repo-setup current-podified -b antelope
                                             popd
                                             rm -rf repo-setup-main
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:16:04 compute-0 ceph-mon[75031]: pgmap v123: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:04 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.a scrub starts
Dec 01 09:16:04 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.a scrub ok
Dec 01 09:16:04 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.10 deep-scrub starts
Dec 01 09:16:04 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.10 deep-scrub ok
Dec 01 09:16:05 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v124: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:05 compute-0 ceph-mon[75031]: 6.a scrub starts
Dec 01 09:16:05 compute-0 ceph-mon[75031]: 6.a scrub ok
Dec 01 09:16:05 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.10 scrub starts
Dec 01 09:16:05 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.10 scrub ok
Dec 01 09:16:06 compute-0 ceph-mon[75031]: 5.10 deep-scrub starts
Dec 01 09:16:06 compute-0 ceph-mon[75031]: 5.10 deep-scrub ok
Dec 01 09:16:06 compute-0 ceph-mon[75031]: pgmap v124: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:06 compute-0 ceph-mon[75031]: 6.10 scrub starts
Dec 01 09:16:06 compute-0 ceph-mon[75031]: 6.10 scrub ok
Dec 01 09:16:06 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.d scrub starts
Dec 01 09:16:06 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.d scrub ok
Dec 01 09:16:06 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.17 scrub starts
Dec 01 09:16:06 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.17 scrub ok
Dec 01 09:16:06 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.12 scrub starts
Dec 01 09:16:06 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.12 scrub ok
Dec 01 09:16:07 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v125: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:07 compute-0 ceph-mon[75031]: 7.d scrub starts
Dec 01 09:16:07 compute-0 ceph-mon[75031]: 7.d scrub ok
Dec 01 09:16:07 compute-0 ceph-mon[75031]: 6.12 scrub starts
Dec 01 09:16:07 compute-0 ceph-mon[75031]: 6.12 scrub ok
Dec 01 09:16:07 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:16:08 compute-0 ceph-mon[75031]: 5.17 scrub starts
Dec 01 09:16:08 compute-0 ceph-mon[75031]: 5.17 scrub ok
Dec 01 09:16:08 compute-0 ceph-mon[75031]: pgmap v125: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:09 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v126: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:09 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.10 deep-scrub starts
Dec 01 09:16:09 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.10 deep-scrub ok
Dec 01 09:16:09 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.1b scrub starts
Dec 01 09:16:09 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.1b scrub ok
Dec 01 09:16:10 compute-0 ceph-mon[75031]: pgmap v126: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:10 compute-0 ceph-mon[75031]: 7.10 deep-scrub starts
Dec 01 09:16:10 compute-0 ceph-mon[75031]: 7.10 deep-scrub ok
Dec 01 09:16:10 compute-0 sudo[102015]: pam_unix(sudo:session): session closed for user root
Dec 01 09:16:11 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v127: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:11 compute-0 sshd-session[101649]: Connection closed by 192.168.122.30 port 59902
Dec 01 09:16:11 compute-0 sshd-session[101646]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:16:11 compute-0 systemd[1]: session-34.scope: Deactivated successfully.
Dec 01 09:16:11 compute-0 systemd[1]: session-34.scope: Consumed 8.800s CPU time.
Dec 01 09:16:11 compute-0 systemd-logind[788]: Session 34 logged out. Waiting for processes to exit.
Dec 01 09:16:11 compute-0 systemd-logind[788]: Removed session 34.
Dec 01 09:16:11 compute-0 ceph-mon[75031]: 5.1b scrub starts
Dec 01 09:16:11 compute-0 ceph-mon[75031]: 5.1b scrub ok
Dec 01 09:16:12 compute-0 ceph-mon[75031]: pgmap v127: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:12 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.16 scrub starts
Dec 01 09:16:12 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.16 scrub ok
Dec 01 09:16:12 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:16:12 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:16:12
Dec 01 09:16:12 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:16:12 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:16:12 compute-0 ceph-mgr[75324]: [balancer INFO root] pools ['cephfs.cephfs.meta', 'cephfs.cephfs.data', 'images', 'backups', 'vms', '.mgr', 'volumes']
Dec 01 09:16:12 compute-0 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec 01 09:16:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:16:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:16:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:16:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:16:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:16:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:16:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:16:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:16:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:16:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:16:13 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v128: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:16:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:16:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:16:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:16:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:16:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:16:13 compute-0 ceph-mon[75031]: 6.16 scrub starts
Dec 01 09:16:13 compute-0 ceph-mon[75031]: 6.16 scrub ok
Dec 01 09:16:13 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.18 scrub starts
Dec 01 09:16:13 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.18 scrub ok
Dec 01 09:16:14 compute-0 ceph-mon[75031]: pgmap v128: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:14 compute-0 ceph-mon[75031]: 6.18 scrub starts
Dec 01 09:16:14 compute-0 ceph-mon[75031]: 6.18 scrub ok
Dec 01 09:16:15 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v129: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:15 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.12 deep-scrub starts
Dec 01 09:16:15 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.12 deep-scrub ok
Dec 01 09:16:15 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.19 scrub starts
Dec 01 09:16:15 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.19 scrub ok
Dec 01 09:16:16 compute-0 ceph-mon[75031]: pgmap v129: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:16 compute-0 ceph-mon[75031]: 7.12 deep-scrub starts
Dec 01 09:16:16 compute-0 ceph-mon[75031]: 7.12 deep-scrub ok
Dec 01 09:16:16 compute-0 ceph-mon[75031]: 6.19 scrub starts
Dec 01 09:16:16 compute-0 ceph-mon[75031]: 6.19 scrub ok
Dec 01 09:16:16 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.1c scrub starts
Dec 01 09:16:16 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.1c scrub ok
Dec 01 09:16:17 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v130: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:17 compute-0 ceph-mon[75031]: 5.1c scrub starts
Dec 01 09:16:17 compute-0 ceph-mon[75031]: 5.1c scrub ok
Dec 01 09:16:17 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.1a deep-scrub starts
Dec 01 09:16:17 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.1f scrub starts
Dec 01 09:16:17 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.1a deep-scrub ok
Dec 01 09:16:17 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.1f scrub ok
Dec 01 09:16:17 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:16:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:16:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:16:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:16:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:16:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:16:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:16:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:16:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:16:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:16:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:16:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:16:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:16:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec 01 09:16:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:16:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:16:18 compute-0 ceph-mon[75031]: pgmap v130: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:18 compute-0 ceph-mon[75031]: 6.1a deep-scrub starts
Dec 01 09:16:18 compute-0 ceph-mon[75031]: 5.1f scrub starts
Dec 01 09:16:18 compute-0 ceph-mon[75031]: 6.1a deep-scrub ok
Dec 01 09:16:18 compute-0 ceph-mon[75031]: 5.1f scrub ok
Dec 01 09:16:19 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v131: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:19 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.14 scrub starts
Dec 01 09:16:19 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.14 scrub ok
Dec 01 09:16:20 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.16 scrub starts
Dec 01 09:16:20 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.16 scrub ok
Dec 01 09:16:20 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.1b scrub starts
Dec 01 09:16:20 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.1b scrub ok
Dec 01 09:16:20 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.18 scrub starts
Dec 01 09:16:20 compute-0 ceph-mon[75031]: pgmap v131: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:20 compute-0 ceph-mon[75031]: 7.14 scrub starts
Dec 01 09:16:20 compute-0 ceph-mon[75031]: 7.14 scrub ok
Dec 01 09:16:20 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.18 scrub ok
Dec 01 09:16:21 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v132: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:21 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.1e scrub starts
Dec 01 09:16:21 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.1e scrub ok
Dec 01 09:16:21 compute-0 ceph-mon[75031]: 7.16 scrub starts
Dec 01 09:16:21 compute-0 ceph-mon[75031]: 7.16 scrub ok
Dec 01 09:16:21 compute-0 ceph-mon[75031]: 6.1b scrub starts
Dec 01 09:16:21 compute-0 ceph-mon[75031]: 6.1b scrub ok
Dec 01 09:16:21 compute-0 ceph-mon[75031]: 4.18 scrub starts
Dec 01 09:16:21 compute-0 ceph-mon[75031]: 4.18 scrub ok
Dec 01 09:16:22 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.19 scrub starts
Dec 01 09:16:22 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.19 scrub ok
Dec 01 09:16:22 compute-0 ceph-mon[75031]: pgmap v132: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:22 compute-0 ceph-mon[75031]: 5.1e scrub starts
Dec 01 09:16:22 compute-0 ceph-mon[75031]: 5.1e scrub ok
Dec 01 09:16:22 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1b scrub starts
Dec 01 09:16:22 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1b scrub ok
Dec 01 09:16:22 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:16:23 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v133: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:23 compute-0 ceph-mon[75031]: 2.19 scrub starts
Dec 01 09:16:23 compute-0 ceph-mon[75031]: 2.19 scrub ok
Dec 01 09:16:23 compute-0 ceph-mon[75031]: 4.1b scrub starts
Dec 01 09:16:23 compute-0 ceph-mon[75031]: 4.1b scrub ok
Dec 01 09:16:24 compute-0 ceph-mon[75031]: pgmap v133: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:24 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1a scrub starts
Dec 01 09:16:24 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1a scrub ok
Dec 01 09:16:25 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v134: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:25 compute-0 ceph-mon[75031]: 4.1a scrub starts
Dec 01 09:16:25 compute-0 ceph-mon[75031]: 4.1a scrub ok
Dec 01 09:16:26 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.17 scrub starts
Dec 01 09:16:26 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.17 scrub ok
Dec 01 09:16:26 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.18 scrub starts
Dec 01 09:16:26 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.18 scrub ok
Dec 01 09:16:26 compute-0 ceph-mon[75031]: pgmap v134: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:26 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.f scrub starts
Dec 01 09:16:26 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.f scrub ok
Dec 01 09:16:27 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v135: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:27 compute-0 sshd-session[102074]: Accepted publickey for zuul from 192.168.122.30 port 34252 ssh2: ECDSA SHA256:qeYLzcMt7u+aIXWvxTim9ZQrhR80x0VMZgLc05vjQmw
Dec 01 09:16:27 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.19 scrub starts
Dec 01 09:16:27 compute-0 systemd-logind[788]: New session 35 of user zuul.
Dec 01 09:16:27 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.19 scrub ok
Dec 01 09:16:27 compute-0 systemd[1]: Started Session 35 of User zuul.
Dec 01 09:16:27 compute-0 sshd-session[102074]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:16:27 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.16 deep-scrub starts
Dec 01 09:16:27 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.16 deep-scrub ok
Dec 01 09:16:28 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:16:28 compute-0 ceph-mon[75031]: 7.17 scrub starts
Dec 01 09:16:28 compute-0 ceph-mon[75031]: 7.17 scrub ok
Dec 01 09:16:28 compute-0 ceph-mon[75031]: 2.18 scrub starts
Dec 01 09:16:28 compute-0 ceph-mon[75031]: 2.18 scrub ok
Dec 01 09:16:28 compute-0 ceph-mon[75031]: 6.f scrub starts
Dec 01 09:16:28 compute-0 ceph-mon[75031]: 6.f scrub ok
Dec 01 09:16:28 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.13 scrub starts
Dec 01 09:16:28 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.13 scrub ok
Dec 01 09:16:28 compute-0 python3.9[102227]: ansible-ansible.legacy.ping Invoked with data=pong
Dec 01 09:16:29 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v136: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:29 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.1d scrub starts
Dec 01 09:16:29 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.1d scrub ok
Dec 01 09:16:29 compute-0 ceph-mon[75031]: pgmap v135: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:29 compute-0 ceph-mon[75031]: 7.19 scrub starts
Dec 01 09:16:29 compute-0 ceph-mon[75031]: 7.19 scrub ok
Dec 01 09:16:29 compute-0 ceph-mon[75031]: 2.16 deep-scrub starts
Dec 01 09:16:29 compute-0 ceph-mon[75031]: 2.16 deep-scrub ok
Dec 01 09:16:29 compute-0 ceph-mon[75031]: 2.13 scrub starts
Dec 01 09:16:29 compute-0 ceph-mon[75031]: 2.13 scrub ok
Dec 01 09:16:30 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.1e scrub starts
Dec 01 09:16:30 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.1e scrub ok
Dec 01 09:16:30 compute-0 python3.9[102401]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:16:30 compute-0 ceph-mon[75031]: pgmap v136: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:30 compute-0 ceph-mon[75031]: 7.1d scrub starts
Dec 01 09:16:30 compute-0 ceph-mon[75031]: 7.1d scrub ok
Dec 01 09:16:31 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v137: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:31 compute-0 sudo[102555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udjcxphqlkypggepjikrorzmeeptqlnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580590.8937182-45-276556536576513/AnsiballZ_command.py'
Dec 01 09:16:31 compute-0 sudo[102555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:16:31 compute-0 ceph-mon[75031]: 7.1e scrub starts
Dec 01 09:16:31 compute-0 ceph-mon[75031]: 7.1e scrub ok
Dec 01 09:16:31 compute-0 python3.9[102557]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:16:31 compute-0 sudo[102555]: pam_unix(sudo:session): session closed for user root
Dec 01 09:16:32 compute-0 sudo[102708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prxvfnjcdxlrzderuljyokxmknpbhuop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580591.9464412-57-247767594775731/AnsiballZ_stat.py'
Dec 01 09:16:32 compute-0 sudo[102708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:16:32 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.14 scrub starts
Dec 01 09:16:32 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.14 scrub ok
Dec 01 09:16:32 compute-0 ceph-mon[75031]: pgmap v137: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:32 compute-0 python3.9[102710]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:16:32 compute-0 sudo[102708]: pam_unix(sudo:session): session closed for user root
Dec 01 09:16:33 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v138: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:33 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1e scrub starts
Dec 01 09:16:33 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1e scrub ok
Dec 01 09:16:33 compute-0 sudo[102862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebgrykrxfxszoimijvsdyctdblxucaya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580592.9542391-68-160227536759050/AnsiballZ_file.py'
Dec 01 09:16:33 compute-0 sudo[102862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:16:33 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:16:33 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.15 deep-scrub starts
Dec 01 09:16:33 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.15 deep-scrub ok
Dec 01 09:16:33 compute-0 ceph-mon[75031]: 5.14 scrub starts
Dec 01 09:16:33 compute-0 ceph-mon[75031]: 5.14 scrub ok
Dec 01 09:16:33 compute-0 python3.9[102864]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:16:33 compute-0 sudo[102862]: pam_unix(sudo:session): session closed for user root
Dec 01 09:16:33 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.e scrub starts
Dec 01 09:16:33 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.e scrub ok
Dec 01 09:16:34 compute-0 sudo[103014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arpuhgatdtierrloaqputmdhxkmgjoto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580593.8189175-77-12189062030969/AnsiballZ_file.py'
Dec 01 09:16:34 compute-0 sudo[103014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:16:34 compute-0 python3.9[103016]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:16:34 compute-0 sudo[103014]: pam_unix(sudo:session): session closed for user root
Dec 01 09:16:34 compute-0 ceph-mon[75031]: pgmap v138: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:34 compute-0 ceph-mon[75031]: 6.1e scrub starts
Dec 01 09:16:34 compute-0 ceph-mon[75031]: 6.1e scrub ok
Dec 01 09:16:34 compute-0 ceph-mon[75031]: 5.15 deep-scrub starts
Dec 01 09:16:34 compute-0 ceph-mon[75031]: 5.15 deep-scrub ok
Dec 01 09:16:34 compute-0 ceph-mon[75031]: 4.e scrub starts
Dec 01 09:16:34 compute-0 ceph-mon[75031]: 4.e scrub ok
Dec 01 09:16:35 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v139: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:35 compute-0 python3.9[103166]: ansible-ansible.builtin.service_facts Invoked
Dec 01 09:16:35 compute-0 network[103183]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 01 09:16:35 compute-0 network[103184]: 'network-scripts' will be removed from distribution in near future.
Dec 01 09:16:35 compute-0 network[103185]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 01 09:16:35 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.f scrub starts
Dec 01 09:16:35 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.f scrub ok
Dec 01 09:16:36 compute-0 ceph-mon[75031]: pgmap v139: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:36 compute-0 ceph-mon[75031]: 2.f scrub starts
Dec 01 09:16:36 compute-0 ceph-mon[75031]: 2.f scrub ok
Dec 01 09:16:37 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v140: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:37 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.18 scrub starts
Dec 01 09:16:37 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.18 scrub ok
Dec 01 09:16:38 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.19 scrub starts
Dec 01 09:16:38 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.19 scrub ok
Dec 01 09:16:38 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.7 scrub starts
Dec 01 09:16:38 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.7 scrub ok
Dec 01 09:16:38 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:16:38 compute-0 ceph-mon[75031]: pgmap v140: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:38 compute-0 ceph-mon[75031]: 5.18 scrub starts
Dec 01 09:16:38 compute-0 ceph-mon[75031]: 5.18 scrub ok
Dec 01 09:16:39 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v141: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:39 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.4 scrub starts
Dec 01 09:16:39 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.4 scrub ok
Dec 01 09:16:39 compute-0 ceph-mon[75031]: 5.19 scrub starts
Dec 01 09:16:39 compute-0 ceph-mon[75031]: 5.19 scrub ok
Dec 01 09:16:39 compute-0 ceph-mon[75031]: 5.7 scrub starts
Dec 01 09:16:39 compute-0 ceph-mon[75031]: 5.7 scrub ok
Dec 01 09:16:40 compute-0 python3.9[103445]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:16:40 compute-0 ceph-mon[75031]: pgmap v141: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:40 compute-0 ceph-mon[75031]: 5.4 scrub starts
Dec 01 09:16:40 compute-0 ceph-mon[75031]: 5.4 scrub ok
Dec 01 09:16:41 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v142: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:41 compute-0 python3.9[103595]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:16:41 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.11 scrub starts
Dec 01 09:16:41 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.11 scrub ok
Dec 01 09:16:42 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1a scrub starts
Dec 01 09:16:42 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1a scrub ok
Dec 01 09:16:42 compute-0 python3.9[103749]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:16:42 compute-0 ceph-mon[75031]: pgmap v142: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:42 compute-0 ceph-mon[75031]: 2.11 scrub starts
Dec 01 09:16:42 compute-0 ceph-mon[75031]: 2.11 scrub ok
Dec 01 09:16:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:16:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:16:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:16:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:16:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:16:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:16:43 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v143: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:43 compute-0 sudo[103905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhwehgejwvotdfnvmnzsniemghnhefyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580602.961916-125-99153550306525/AnsiballZ_setup.py'
Dec 01 09:16:43 compute-0 sudo[103905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:16:43 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.3 scrub starts
Dec 01 09:16:43 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.3 scrub ok
Dec 01 09:16:43 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:16:43 compute-0 python3.9[103907]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 09:16:43 compute-0 ceph-mon[75031]: 5.1a scrub starts
Dec 01 09:16:43 compute-0 ceph-mon[75031]: 5.1a scrub ok
Dec 01 09:16:43 compute-0 sudo[103905]: pam_unix(sudo:session): session closed for user root
Dec 01 09:16:44 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1d scrub starts
Dec 01 09:16:44 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1d scrub ok
Dec 01 09:16:44 compute-0 sudo[103989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyntzhxsbczzquvspmgascaraervbmti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580602.961916-125-99153550306525/AnsiballZ_dnf.py'
Dec 01 09:16:44 compute-0 sudo[103989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:16:44 compute-0 python3.9[103991]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:16:44 compute-0 ceph-mon[75031]: pgmap v143: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:44 compute-0 ceph-mon[75031]: 5.3 scrub starts
Dec 01 09:16:44 compute-0 ceph-mon[75031]: 5.3 scrub ok
Dec 01 09:16:45 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v144: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:45 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.8 scrub starts
Dec 01 09:16:45 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.8 scrub ok
Dec 01 09:16:45 compute-0 ceph-mon[75031]: 5.1d scrub starts
Dec 01 09:16:45 compute-0 ceph-mon[75031]: 5.1d scrub ok
Dec 01 09:16:46 compute-0 ceph-mon[75031]: pgmap v144: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:46 compute-0 ceph-mon[75031]: 2.8 scrub starts
Dec 01 09:16:46 compute-0 ceph-mon[75031]: 2.8 scrub ok
Dec 01 09:16:47 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v145: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:47 compute-0 sudo[104052]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:16:47 compute-0 sudo[104052]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:16:47 compute-0 sudo[104052]: pam_unix(sudo:session): session closed for user root
Dec 01 09:16:47 compute-0 sudo[104077]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:16:47 compute-0 sudo[104077]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:16:47 compute-0 sudo[104077]: pam_unix(sudo:session): session closed for user root
Dec 01 09:16:47 compute-0 sudo[104102]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:16:47 compute-0 sudo[104102]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:16:47 compute-0 sudo[104102]: pam_unix(sudo:session): session closed for user root
Dec 01 09:16:47 compute-0 sudo[104130]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ls
Dec 01 09:16:47 compute-0 sudo[104130]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:16:47 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.a scrub starts
Dec 01 09:16:47 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.a scrub ok
Dec 01 09:16:48 compute-0 ceph-mon[75031]: pgmap v145: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:48 compute-0 podman[104228]: 2025-12-01 09:16:48.242558863 +0000 UTC m=+0.478844759 container exec a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.39.3)
Dec 01 09:16:48 compute-0 podman[104228]: 2025-12-01 09:16:48.338715168 +0000 UTC m=+0.575001044 container exec_died a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, ceph=True, io.buildah.version=1.39.3, org.label-schema.build-date=20250507)
Dec 01 09:16:48 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:16:48 compute-0 sudo[104130]: pam_unix(sudo:session): session closed for user root
Dec 01 09:16:48 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:16:48 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:16:48 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:16:48 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:16:49 compute-0 sudo[104369]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:16:49 compute-0 sudo[104369]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:16:49 compute-0 sudo[104369]: pam_unix(sudo:session): session closed for user root
Dec 01 09:16:49 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v146: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:49 compute-0 sudo[104394]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:16:49 compute-0 sudo[104394]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:16:49 compute-0 sudo[104394]: pam_unix(sudo:session): session closed for user root
Dec 01 09:16:49 compute-0 sudo[104419]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:16:49 compute-0 sudo[104419]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:16:49 compute-0 sudo[104419]: pam_unix(sudo:session): session closed for user root
Dec 01 09:16:49 compute-0 sudo[104444]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Dec 01 09:16:49 compute-0 sudo[104444]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:16:49 compute-0 ceph-mon[75031]: 4.a scrub starts
Dec 01 09:16:49 compute-0 ceph-mon[75031]: 4.a scrub ok
Dec 01 09:16:49 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:16:49 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:16:49 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.5 scrub starts
Dec 01 09:16:49 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.5 scrub ok
Dec 01 09:16:49 compute-0 sudo[104444]: pam_unix(sudo:session): session closed for user root
Dec 01 09:16:49 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:16:49 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:16:49 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec 01 09:16:49 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:16:49 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec 01 09:16:49 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:16:49 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 2e9f07b2-5977-43a9-93f8-19beb8bb89ba does not exist
Dec 01 09:16:49 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 62e2a1ab-02f6-460e-85f3-0507d216486a does not exist
Dec 01 09:16:49 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 56b3bf1c-3b66-49d9-966c-fe45e73b3855 does not exist
Dec 01 09:16:49 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec 01 09:16:49 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:16:49 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec 01 09:16:49 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:16:49 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:16:49 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:16:49 compute-0 sudo[104501]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:16:49 compute-0 sudo[104501]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:16:49 compute-0 sudo[104501]: pam_unix(sudo:session): session closed for user root
Dec 01 09:16:49 compute-0 sudo[104526]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:16:49 compute-0 sudo[104526]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:16:49 compute-0 sudo[104526]: pam_unix(sudo:session): session closed for user root
Dec 01 09:16:49 compute-0 sudo[104552]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:16:49 compute-0 sudo[104552]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:16:49 compute-0 sudo[104552]: pam_unix(sudo:session): session closed for user root
Dec 01 09:16:49 compute-0 sudo[104577]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --yes --no-systemd
Dec 01 09:16:49 compute-0 sudo[104577]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:16:50 compute-0 podman[104647]: 2025-12-01 09:16:50.136703191 +0000 UTC m=+0.038231348 container create fb8aef7935b0ef187322587bafad29c881af6ff49dc95e9444c7422576bf0cdc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_chandrasekhar, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:16:50 compute-0 systemd[1]: Started libpod-conmon-fb8aef7935b0ef187322587bafad29c881af6ff49dc95e9444c7422576bf0cdc.scope.
Dec 01 09:16:50 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:16:50 compute-0 ceph-mon[75031]: pgmap v146: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:50 compute-0 ceph-mon[75031]: 5.5 scrub starts
Dec 01 09:16:50 compute-0 ceph-mon[75031]: 5.5 scrub ok
Dec 01 09:16:50 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:16:50 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:16:50 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:16:50 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:16:50 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:16:50 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:16:50 compute-0 podman[104647]: 2025-12-01 09:16:50.120246403 +0000 UTC m=+0.021774580 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:16:50 compute-0 podman[104647]: 2025-12-01 09:16:50.221946856 +0000 UTC m=+0.123475033 container init fb8aef7935b0ef187322587bafad29c881af6ff49dc95e9444c7422576bf0cdc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_chandrasekhar, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef)
Dec 01 09:16:50 compute-0 podman[104647]: 2025-12-01 09:16:50.230040746 +0000 UTC m=+0.131568903 container start fb8aef7935b0ef187322587bafad29c881af6ff49dc95e9444c7422576bf0cdc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_chandrasekhar, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:16:50 compute-0 podman[104647]: 2025-12-01 09:16:50.233887915 +0000 UTC m=+0.135416102 container attach fb8aef7935b0ef187322587bafad29c881af6ff49dc95e9444c7422576bf0cdc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_chandrasekhar, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0)
Dec 01 09:16:50 compute-0 boring_chandrasekhar[104664]: 167 167
Dec 01 09:16:50 compute-0 systemd[1]: libpod-fb8aef7935b0ef187322587bafad29c881af6ff49dc95e9444c7422576bf0cdc.scope: Deactivated successfully.
Dec 01 09:16:50 compute-0 podman[104647]: 2025-12-01 09:16:50.23826111 +0000 UTC m=+0.139789267 container died fb8aef7935b0ef187322587bafad29c881af6ff49dc95e9444c7422576bf0cdc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_chandrasekhar, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:16:50 compute-0 systemd[1]: var-lib-containers-storage-overlay-87e0525d1a5c733159722fb2a1db49d5bab1f5d77bdc7ecf59b145ce89bd2ac8-merged.mount: Deactivated successfully.
Dec 01 09:16:50 compute-0 podman[104647]: 2025-12-01 09:16:50.276222679 +0000 UTC m=+0.177750836 container remove fb8aef7935b0ef187322587bafad29c881af6ff49dc95e9444c7422576bf0cdc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_chandrasekhar, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:16:50 compute-0 systemd[1]: libpod-conmon-fb8aef7935b0ef187322587bafad29c881af6ff49dc95e9444c7422576bf0cdc.scope: Deactivated successfully.
Dec 01 09:16:50 compute-0 podman[104688]: 2025-12-01 09:16:50.446556463 +0000 UTC m=+0.062243421 container create 0c7195a7c43d75968569021f414daebea71978dfb77275a0d3c2d1ac47495bb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_poitras, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:16:50 compute-0 systemd[1]: Started libpod-conmon-0c7195a7c43d75968569021f414daebea71978dfb77275a0d3c2d1ac47495bb6.scope.
Dec 01 09:16:50 compute-0 podman[104688]: 2025-12-01 09:16:50.41656934 +0000 UTC m=+0.032256368 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:16:50 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:16:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3248251a3969ae7c03ae3d850d86ad608380a03b8474a128dcc5a3c731260f78/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:16:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3248251a3969ae7c03ae3d850d86ad608380a03b8474a128dcc5a3c731260f78/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:16:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3248251a3969ae7c03ae3d850d86ad608380a03b8474a128dcc5a3c731260f78/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:16:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3248251a3969ae7c03ae3d850d86ad608380a03b8474a128dcc5a3c731260f78/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:16:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3248251a3969ae7c03ae3d850d86ad608380a03b8474a128dcc5a3c731260f78/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:16:50 compute-0 podman[104688]: 2025-12-01 09:16:50.534410982 +0000 UTC m=+0.150097970 container init 0c7195a7c43d75968569021f414daebea71978dfb77275a0d3c2d1ac47495bb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_poitras, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 01 09:16:50 compute-0 podman[104688]: 2025-12-01 09:16:50.541307208 +0000 UTC m=+0.156994126 container start 0c7195a7c43d75968569021f414daebea71978dfb77275a0d3c2d1ac47495bb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_poitras, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:16:50 compute-0 podman[104688]: 2025-12-01 09:16:50.544731185 +0000 UTC m=+0.160418193 container attach 0c7195a7c43d75968569021f414daebea71978dfb77275a0d3c2d1ac47495bb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_poitras, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 01 09:16:51 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v147: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:51 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.b scrub starts
Dec 01 09:16:51 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.b scrub ok
Dec 01 09:16:51 compute-0 silly_poitras[104705]: --> passed data devices: 0 physical, 3 LVM
Dec 01 09:16:51 compute-0 silly_poitras[104705]: --> relative data size: 1.0
Dec 01 09:16:51 compute-0 silly_poitras[104705]: --> All data devices are unavailable
Dec 01 09:16:51 compute-0 systemd[1]: libpod-0c7195a7c43d75968569021f414daebea71978dfb77275a0d3c2d1ac47495bb6.scope: Deactivated successfully.
Dec 01 09:16:51 compute-0 podman[104688]: 2025-12-01 09:16:51.57802276 +0000 UTC m=+1.193709688 container died 0c7195a7c43d75968569021f414daebea71978dfb77275a0d3c2d1ac47495bb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_poitras, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:16:51 compute-0 systemd[1]: var-lib-containers-storage-overlay-3248251a3969ae7c03ae3d850d86ad608380a03b8474a128dcc5a3c731260f78-merged.mount: Deactivated successfully.
Dec 01 09:16:51 compute-0 podman[104688]: 2025-12-01 09:16:51.636356599 +0000 UTC m=+1.252043517 container remove 0c7195a7c43d75968569021f414daebea71978dfb77275a0d3c2d1ac47495bb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_poitras, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec 01 09:16:51 compute-0 systemd[1]: libpod-conmon-0c7195a7c43d75968569021f414daebea71978dfb77275a0d3c2d1ac47495bb6.scope: Deactivated successfully.
Dec 01 09:16:51 compute-0 sudo[104577]: pam_unix(sudo:session): session closed for user root
Dec 01 09:16:51 compute-0 sudo[104746]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:16:51 compute-0 sudo[104746]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:16:51 compute-0 sudo[104746]: pam_unix(sudo:session): session closed for user root
Dec 01 09:16:51 compute-0 sudo[104771]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:16:51 compute-0 sudo[104771]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:16:51 compute-0 sudo[104771]: pam_unix(sudo:session): session closed for user root
Dec 01 09:16:51 compute-0 sudo[104796]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:16:51 compute-0 sudo[104796]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:16:51 compute-0 sudo[104796]: pam_unix(sudo:session): session closed for user root
Dec 01 09:16:51 compute-0 sudo[104821]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- lvm list --format json
Dec 01 09:16:51 compute-0 sudo[104821]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:16:52 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.f deep-scrub starts
Dec 01 09:16:52 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.f deep-scrub ok
Dec 01 09:16:52 compute-0 ceph-mon[75031]: pgmap v147: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:52 compute-0 ceph-mon[75031]: 2.b scrub starts
Dec 01 09:16:52 compute-0 ceph-mon[75031]: 2.b scrub ok
Dec 01 09:16:52 compute-0 podman[104884]: 2025-12-01 09:16:52.253677815 +0000 UTC m=+0.046889015 container create 5f18e84c7b5ab18832f89f10de7807c1d527ca2d7e11e32fccc02656db3c86af (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=adoring_ritchie, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:16:52 compute-0 systemd[1]: Started libpod-conmon-5f18e84c7b5ab18832f89f10de7807c1d527ca2d7e11e32fccc02656db3c86af.scope.
Dec 01 09:16:52 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:16:52 compute-0 podman[104884]: 2025-12-01 09:16:52.324763336 +0000 UTC m=+0.117974556 container init 5f18e84c7b5ab18832f89f10de7807c1d527ca2d7e11e32fccc02656db3c86af (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=adoring_ritchie, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef)
Dec 01 09:16:52 compute-0 podman[104884]: 2025-12-01 09:16:52.330972883 +0000 UTC m=+0.124184083 container start 5f18e84c7b5ab18832f89f10de7807c1d527ca2d7e11e32fccc02656db3c86af (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=adoring_ritchie, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:16:52 compute-0 podman[104884]: 2025-12-01 09:16:52.237100103 +0000 UTC m=+0.030311323 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:16:52 compute-0 adoring_ritchie[104899]: 167 167
Dec 01 09:16:52 compute-0 systemd[1]: libpod-5f18e84c7b5ab18832f89f10de7807c1d527ca2d7e11e32fccc02656db3c86af.scope: Deactivated successfully.
Dec 01 09:16:52 compute-0 podman[104884]: 2025-12-01 09:16:52.345685361 +0000 UTC m=+0.138896561 container attach 5f18e84c7b5ab18832f89f10de7807c1d527ca2d7e11e32fccc02656db3c86af (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=adoring_ritchie, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Dec 01 09:16:52 compute-0 podman[104884]: 2025-12-01 09:16:52.346362751 +0000 UTC m=+0.139573961 container died 5f18e84c7b5ab18832f89f10de7807c1d527ca2d7e11e32fccc02656db3c86af (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=adoring_ritchie, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 01 09:16:52 compute-0 systemd[1]: var-lib-containers-storage-overlay-ee324b259928c7cded8e9ffe8c6b09e8c106a0f3f9753a8654adf8fcf98e8fb2-merged.mount: Deactivated successfully.
Dec 01 09:16:52 compute-0 podman[104884]: 2025-12-01 09:16:52.380733488 +0000 UTC m=+0.173944688 container remove 5f18e84c7b5ab18832f89f10de7807c1d527ca2d7e11e32fccc02656db3c86af (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=adoring_ritchie, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:16:52 compute-0 systemd[1]: libpod-conmon-5f18e84c7b5ab18832f89f10de7807c1d527ca2d7e11e32fccc02656db3c86af.scope: Deactivated successfully.
Dec 01 09:16:52 compute-0 podman[104922]: 2025-12-01 09:16:52.536042865 +0000 UTC m=+0.038082894 container create 55d006748401b1455d628831f8e90bd14427336ef5aa8a607365641eaea749f6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_mirzakhani, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:16:52 compute-0 systemd[1]: Started libpod-conmon-55d006748401b1455d628831f8e90bd14427336ef5aa8a607365641eaea749f6.scope.
Dec 01 09:16:52 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:16:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d2a48ce4fda5e118f4a811f4858503a33d1757c6f2c4acc1b074b5c305d10a54/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:16:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d2a48ce4fda5e118f4a811f4858503a33d1757c6f2c4acc1b074b5c305d10a54/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:16:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d2a48ce4fda5e118f4a811f4858503a33d1757c6f2c4acc1b074b5c305d10a54/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:16:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d2a48ce4fda5e118f4a811f4858503a33d1757c6f2c4acc1b074b5c305d10a54/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:16:52 compute-0 podman[104922]: 2025-12-01 09:16:52.595378772 +0000 UTC m=+0.097418891 container init 55d006748401b1455d628831f8e90bd14427336ef5aa8a607365641eaea749f6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_mirzakhani, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507)
Dec 01 09:16:52 compute-0 podman[104922]: 2025-12-01 09:16:52.602563407 +0000 UTC m=+0.104603436 container start 55d006748401b1455d628831f8e90bd14427336ef5aa8a607365641eaea749f6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_mirzakhani, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 01 09:16:52 compute-0 podman[104922]: 2025-12-01 09:16:52.605611123 +0000 UTC m=+0.107651172 container attach 55d006748401b1455d628831f8e90bd14427336ef5aa8a607365641eaea749f6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_mirzakhani, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 01 09:16:52 compute-0 podman[104922]: 2025-12-01 09:16:52.520070191 +0000 UTC m=+0.022110250 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:16:52 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.8 scrub starts
Dec 01 09:16:52 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.8 scrub ok
Dec 01 09:16:53 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v148: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:53 compute-0 ceph-mon[75031]: 5.f deep-scrub starts
Dec 01 09:16:53 compute-0 ceph-mon[75031]: 5.f deep-scrub ok
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]: {
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:     "0": [
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:         {
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             "devices": [
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "/dev/loop3"
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             ],
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             "lv_name": "ceph_lv0",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             "lv_size": "21470642176",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             "name": "ceph_lv0",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             "tags": {
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.cluster_name": "ceph",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.crush_device_class": "",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.encrypted": "0",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.osd_id": "0",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.type": "block",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.vdo": "0"
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             },
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             "type": "block",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             "vg_name": "ceph_vg0"
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:         }
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:     ],
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:     "1": [
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:         {
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             "devices": [
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "/dev/loop4"
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             ],
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             "lv_name": "ceph_lv1",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             "lv_size": "21470642176",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             "name": "ceph_lv1",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             "tags": {
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.cluster_name": "ceph",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.crush_device_class": "",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.encrypted": "0",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.osd_id": "1",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.type": "block",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.vdo": "0"
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             },
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             "type": "block",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             "vg_name": "ceph_vg1"
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:         }
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:     ],
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:     "2": [
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:         {
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             "devices": [
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "/dev/loop5"
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             ],
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             "lv_name": "ceph_lv2",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             "lv_size": "21470642176",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             "name": "ceph_lv2",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             "tags": {
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.cluster_name": "ceph",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.crush_device_class": "",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.encrypted": "0",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.osd_id": "2",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.type": "block",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:                 "ceph.vdo": "0"
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             },
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             "type": "block",
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:             "vg_name": "ceph_vg2"
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:         }
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]:     ]
Dec 01 09:16:53 compute-0 trusting_mirzakhani[104938]: }
Dec 01 09:16:53 compute-0 systemd[1]: libpod-55d006748401b1455d628831f8e90bd14427336ef5aa8a607365641eaea749f6.scope: Deactivated successfully.
Dec 01 09:16:53 compute-0 conmon[104938]: conmon 55d006748401b1455d62 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-55d006748401b1455d628831f8e90bd14427336ef5aa8a607365641eaea749f6.scope/container/memory.events
Dec 01 09:16:53 compute-0 podman[104922]: 2025-12-01 09:16:53.340870503 +0000 UTC m=+0.842910532 container died 55d006748401b1455d628831f8e90bd14427336ef5aa8a607365641eaea749f6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_mirzakhani, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:16:53 compute-0 systemd[1]: var-lib-containers-storage-overlay-d2a48ce4fda5e118f4a811f4858503a33d1757c6f2c4acc1b074b5c305d10a54-merged.mount: Deactivated successfully.
Dec 01 09:16:53 compute-0 podman[104922]: 2025-12-01 09:16:53.401842397 +0000 UTC m=+0.903882446 container remove 55d006748401b1455d628831f8e90bd14427336ef5aa8a607365641eaea749f6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_mirzakhani, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:16:53 compute-0 systemd[1]: libpod-conmon-55d006748401b1455d628831f8e90bd14427336ef5aa8a607365641eaea749f6.scope: Deactivated successfully.
Dec 01 09:16:53 compute-0 sudo[104821]: pam_unix(sudo:session): session closed for user root
Dec 01 09:16:53 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:16:53 compute-0 sudo[104961]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:16:53 compute-0 sudo[104961]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:16:53 compute-0 sudo[104961]: pam_unix(sudo:session): session closed for user root
Dec 01 09:16:53 compute-0 sudo[104986]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:16:53 compute-0 sudo[104986]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:16:53 compute-0 sudo[104986]: pam_unix(sudo:session): session closed for user root
Dec 01 09:16:53 compute-0 sudo[105011]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:16:53 compute-0 sudo[105011]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:16:53 compute-0 sudo[105011]: pam_unix(sudo:session): session closed for user root
Dec 01 09:16:53 compute-0 sudo[105036]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- raw list --format json
Dec 01 09:16:53 compute-0 sudo[105036]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:16:53 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.14 scrub starts
Dec 01 09:16:53 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.14 scrub ok
Dec 01 09:16:54 compute-0 podman[105102]: 2025-12-01 09:16:54.001781259 +0000 UTC m=+0.048296714 container create a08071b70d525a2316a99dad6e045a7ca2b02ec2601669d87b971937791775c6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_bhaskara, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:16:54 compute-0 systemd[1]: Started libpod-conmon-a08071b70d525a2316a99dad6e045a7ca2b02ec2601669d87b971937791775c6.scope.
Dec 01 09:16:54 compute-0 podman[105102]: 2025-12-01 09:16:53.979619239 +0000 UTC m=+0.026134744 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:16:54 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:16:54 compute-0 podman[105102]: 2025-12-01 09:16:54.089933606 +0000 UTC m=+0.136449061 container init a08071b70d525a2316a99dad6e045a7ca2b02ec2601669d87b971937791775c6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_bhaskara, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:16:54 compute-0 podman[105102]: 2025-12-01 09:16:54.097590144 +0000 UTC m=+0.144105599 container start a08071b70d525a2316a99dad6e045a7ca2b02ec2601669d87b971937791775c6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_bhaskara, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:16:54 compute-0 podman[105102]: 2025-12-01 09:16:54.101056762 +0000 UTC m=+0.147572247 container attach a08071b70d525a2316a99dad6e045a7ca2b02ec2601669d87b971937791775c6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_bhaskara, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:16:54 compute-0 systemd[1]: libpod-a08071b70d525a2316a99dad6e045a7ca2b02ec2601669d87b971937791775c6.scope: Deactivated successfully.
Dec 01 09:16:54 compute-0 admiring_bhaskara[105118]: 167 167
Dec 01 09:16:54 compute-0 conmon[105118]: conmon a08071b70d525a2316a9 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-a08071b70d525a2316a99dad6e045a7ca2b02ec2601669d87b971937791775c6.scope/container/memory.events
Dec 01 09:16:54 compute-0 podman[105102]: 2025-12-01 09:16:54.10413414 +0000 UTC m=+0.150649595 container died a08071b70d525a2316a99dad6e045a7ca2b02ec2601669d87b971937791775c6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_bhaskara, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:16:54 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.9 scrub starts
Dec 01 09:16:54 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.9 scrub ok
Dec 01 09:16:54 compute-0 systemd[1]: var-lib-containers-storage-overlay-58949cd145715e7c1ecb3eaf349e3f39b4899de6216dc16df083dd08b9efa073-merged.mount: Deactivated successfully.
Dec 01 09:16:54 compute-0 podman[105102]: 2025-12-01 09:16:54.138690053 +0000 UTC m=+0.185205508 container remove a08071b70d525a2316a99dad6e045a7ca2b02ec2601669d87b971937791775c6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_bhaskara, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:16:54 compute-0 systemd[1]: libpod-conmon-a08071b70d525a2316a99dad6e045a7ca2b02ec2601669d87b971937791775c6.scope: Deactivated successfully.
Dec 01 09:16:54 compute-0 ceph-mon[75031]: 6.8 scrub starts
Dec 01 09:16:54 compute-0 ceph-mon[75031]: 6.8 scrub ok
Dec 01 09:16:54 compute-0 ceph-mon[75031]: pgmap v148: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:54 compute-0 ceph-mon[75031]: 6.14 scrub starts
Dec 01 09:16:54 compute-0 ceph-mon[75031]: 6.14 scrub ok
Dec 01 09:16:54 compute-0 podman[105144]: 2025-12-01 09:16:54.293870026 +0000 UTC m=+0.038290090 container create 87edd6334ff0daf574aa3ae080250968632330c03c2a723b7df4fa24eae6f2b9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_heyrovsky, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:16:54 compute-0 systemd[1]: Started libpod-conmon-87edd6334ff0daf574aa3ae080250968632330c03c2a723b7df4fa24eae6f2b9.scope.
Dec 01 09:16:54 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:16:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e34c327657f84cb9190bedb62a4a35481642ffc67503f0522f1c1bde4861971e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:16:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e34c327657f84cb9190bedb62a4a35481642ffc67503f0522f1c1bde4861971e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:16:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e34c327657f84cb9190bedb62a4a35481642ffc67503f0522f1c1bde4861971e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:16:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e34c327657f84cb9190bedb62a4a35481642ffc67503f0522f1c1bde4861971e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:16:54 compute-0 podman[105144]: 2025-12-01 09:16:54.279655571 +0000 UTC m=+0.024075655 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:16:54 compute-0 podman[105144]: 2025-12-01 09:16:54.378168523 +0000 UTC m=+0.122588597 container init 87edd6334ff0daf574aa3ae080250968632330c03c2a723b7df4fa24eae6f2b9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_heyrovsky, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Dec 01 09:16:54 compute-0 podman[105144]: 2025-12-01 09:16:54.387063816 +0000 UTC m=+0.131483880 container start 87edd6334ff0daf574aa3ae080250968632330c03c2a723b7df4fa24eae6f2b9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_heyrovsky, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Dec 01 09:16:54 compute-0 podman[105144]: 2025-12-01 09:16:54.390695519 +0000 UTC m=+0.135115603 container attach 87edd6334ff0daf574aa3ae080250968632330c03c2a723b7df4fa24eae6f2b9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_heyrovsky, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507)
Dec 01 09:16:54 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.2 scrub starts
Dec 01 09:16:54 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.2 scrub ok
Dec 01 09:16:54 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.13 scrub starts
Dec 01 09:16:54 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.13 scrub ok
Dec 01 09:16:55 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v149: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:55 compute-0 ceph-mon[75031]: 2.9 scrub starts
Dec 01 09:16:55 compute-0 ceph-mon[75031]: 2.9 scrub ok
Dec 01 09:16:55 compute-0 ceph-mon[75031]: 5.2 scrub starts
Dec 01 09:16:55 compute-0 ceph-mon[75031]: 5.2 scrub ok
Dec 01 09:16:55 compute-0 ceph-mon[75031]: 4.13 scrub starts
Dec 01 09:16:55 compute-0 ceph-mon[75031]: 4.13 scrub ok
Dec 01 09:16:55 compute-0 eloquent_heyrovsky[105160]: {
Dec 01 09:16:55 compute-0 eloquent_heyrovsky[105160]:     "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec 01 09:16:55 compute-0 eloquent_heyrovsky[105160]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:16:55 compute-0 eloquent_heyrovsky[105160]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 01 09:16:55 compute-0 eloquent_heyrovsky[105160]:         "osd_id": 0,
Dec 01 09:16:55 compute-0 eloquent_heyrovsky[105160]:         "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:16:55 compute-0 eloquent_heyrovsky[105160]:         "type": "bluestore"
Dec 01 09:16:55 compute-0 eloquent_heyrovsky[105160]:     },
Dec 01 09:16:55 compute-0 eloquent_heyrovsky[105160]:     "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec 01 09:16:55 compute-0 eloquent_heyrovsky[105160]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:16:55 compute-0 eloquent_heyrovsky[105160]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 01 09:16:55 compute-0 eloquent_heyrovsky[105160]:         "osd_id": 1,
Dec 01 09:16:55 compute-0 eloquent_heyrovsky[105160]:         "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:16:55 compute-0 eloquent_heyrovsky[105160]:         "type": "bluestore"
Dec 01 09:16:55 compute-0 eloquent_heyrovsky[105160]:     },
Dec 01 09:16:55 compute-0 eloquent_heyrovsky[105160]:     "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec 01 09:16:55 compute-0 eloquent_heyrovsky[105160]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:16:55 compute-0 eloquent_heyrovsky[105160]:         "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec 01 09:16:55 compute-0 eloquent_heyrovsky[105160]:         "osd_id": 2,
Dec 01 09:16:55 compute-0 eloquent_heyrovsky[105160]:         "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:16:55 compute-0 eloquent_heyrovsky[105160]:         "type": "bluestore"
Dec 01 09:16:55 compute-0 eloquent_heyrovsky[105160]:     }
Dec 01 09:16:55 compute-0 eloquent_heyrovsky[105160]: }
Dec 01 09:16:55 compute-0 systemd[1]: libpod-87edd6334ff0daf574aa3ae080250968632330c03c2a723b7df4fa24eae6f2b9.scope: Deactivated successfully.
Dec 01 09:16:55 compute-0 podman[105144]: 2025-12-01 09:16:55.378266844 +0000 UTC m=+1.122686918 container died 87edd6334ff0daf574aa3ae080250968632330c03c2a723b7df4fa24eae6f2b9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_heyrovsky, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec 01 09:16:55 compute-0 systemd[1]: var-lib-containers-storage-overlay-e34c327657f84cb9190bedb62a4a35481642ffc67503f0522f1c1bde4861971e-merged.mount: Deactivated successfully.
Dec 01 09:16:55 compute-0 podman[105144]: 2025-12-01 09:16:55.432721433 +0000 UTC m=+1.177141497 container remove 87edd6334ff0daf574aa3ae080250968632330c03c2a723b7df4fa24eae6f2b9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_heyrovsky, ceph=True, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:16:55 compute-0 systemd[1]: libpod-conmon-87edd6334ff0daf574aa3ae080250968632330c03c2a723b7df4fa24eae6f2b9.scope: Deactivated successfully.
Dec 01 09:16:55 compute-0 sudo[105036]: pam_unix(sudo:session): session closed for user root
Dec 01 09:16:55 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:16:55 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:16:55 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:16:55 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:16:55 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 1c2fefe1-da38-4e99-98df-facfe1210edb does not exist
Dec 01 09:16:55 compute-0 sudo[105205]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:16:55 compute-0 sudo[105205]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:16:55 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.11 scrub starts
Dec 01 09:16:55 compute-0 sudo[105205]: pam_unix(sudo:session): session closed for user root
Dec 01 09:16:55 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.11 scrub ok
Dec 01 09:16:55 compute-0 sudo[105230]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:16:55 compute-0 sudo[105230]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:16:55 compute-0 sudo[105230]: pam_unix(sudo:session): session closed for user root
Dec 01 09:16:56 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.d scrub starts
Dec 01 09:16:56 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.d scrub ok
Dec 01 09:16:56 compute-0 ceph-mon[75031]: pgmap v149: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:56 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:16:56 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:16:56 compute-0 ceph-mon[75031]: 6.11 scrub starts
Dec 01 09:16:56 compute-0 ceph-mon[75031]: 6.11 scrub ok
Dec 01 09:16:57 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v150: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:57 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.c scrub starts
Dec 01 09:16:57 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.c scrub ok
Dec 01 09:16:57 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.1d scrub starts
Dec 01 09:16:57 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.1d scrub ok
Dec 01 09:16:57 compute-0 ceph-mon[75031]: 4.d scrub starts
Dec 01 09:16:57 compute-0 ceph-mon[75031]: 4.d scrub ok
Dec 01 09:16:58 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:16:58 compute-0 ceph-mon[75031]: pgmap v150: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:58 compute-0 ceph-mon[75031]: 6.c scrub starts
Dec 01 09:16:58 compute-0 ceph-mon[75031]: 6.c scrub ok
Dec 01 09:16:58 compute-0 ceph-mon[75031]: 2.1d scrub starts
Dec 01 09:16:58 compute-0 ceph-mon[75031]: 2.1d scrub ok
Dec 01 09:16:59 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.6 scrub starts
Dec 01 09:16:59 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v151: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:16:59 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.6 scrub ok
Dec 01 09:16:59 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.1f scrub starts
Dec 01 09:16:59 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.1f scrub ok
Dec 01 09:16:59 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.11 scrub starts
Dec 01 09:16:59 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.11 scrub ok
Dec 01 09:17:00 compute-0 ceph-mon[75031]: 2.6 scrub starts
Dec 01 09:17:00 compute-0 ceph-mon[75031]: pgmap v151: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:00 compute-0 ceph-mon[75031]: 2.6 scrub ok
Dec 01 09:17:00 compute-0 ceph-mon[75031]: 2.1f scrub starts
Dec 01 09:17:00 compute-0 ceph-mon[75031]: 2.1f scrub ok
Dec 01 09:17:00 compute-0 ceph-mon[75031]: 4.11 scrub starts
Dec 01 09:17:00 compute-0 ceph-mon[75031]: 4.11 scrub ok
Dec 01 09:17:01 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v152: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:01 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.c deep-scrub starts
Dec 01 09:17:01 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.c deep-scrub ok
Dec 01 09:17:01 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.13 scrub starts
Dec 01 09:17:01 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.13 scrub ok
Dec 01 09:17:02 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.d scrub starts
Dec 01 09:17:02 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.d scrub ok
Dec 01 09:17:02 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.1c scrub starts
Dec 01 09:17:02 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.1c scrub ok
Dec 01 09:17:02 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.15 scrub starts
Dec 01 09:17:02 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.15 scrub ok
Dec 01 09:17:02 compute-0 ceph-mon[75031]: pgmap v152: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:02 compute-0 ceph-mon[75031]: 5.c deep-scrub starts
Dec 01 09:17:02 compute-0 ceph-mon[75031]: 5.c deep-scrub ok
Dec 01 09:17:02 compute-0 ceph-mon[75031]: 6.13 scrub starts
Dec 01 09:17:02 compute-0 ceph-mon[75031]: 6.13 scrub ok
Dec 01 09:17:03 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1 deep-scrub starts
Dec 01 09:17:03 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1 deep-scrub ok
Dec 01 09:17:03 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v153: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:03 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:17:03 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.1f deep-scrub starts
Dec 01 09:17:03 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.1f deep-scrub ok
Dec 01 09:17:03 compute-0 ceph-mon[75031]: 6.d scrub starts
Dec 01 09:17:03 compute-0 ceph-mon[75031]: 6.d scrub ok
Dec 01 09:17:03 compute-0 ceph-mon[75031]: 2.1c scrub starts
Dec 01 09:17:03 compute-0 ceph-mon[75031]: 2.1c scrub ok
Dec 01 09:17:03 compute-0 ceph-mon[75031]: 6.15 scrub starts
Dec 01 09:17:03 compute-0 ceph-mon[75031]: 6.15 scrub ok
Dec 01 09:17:04 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.f scrub starts
Dec 01 09:17:04 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.f scrub ok
Dec 01 09:17:04 compute-0 ceph-mon[75031]: 5.1 deep-scrub starts
Dec 01 09:17:04 compute-0 ceph-mon[75031]: 5.1 deep-scrub ok
Dec 01 09:17:04 compute-0 ceph-mon[75031]: pgmap v153: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:04 compute-0 ceph-mon[75031]: 6.1f deep-scrub starts
Dec 01 09:17:04 compute-0 ceph-mon[75031]: 6.1f deep-scrub ok
Dec 01 09:17:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.5 deep-scrub starts
Dec 01 09:17:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.5 deep-scrub ok
Dec 01 09:17:05 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v154: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:05 compute-0 ceph-mon[75031]: 4.f scrub starts
Dec 01 09:17:05 compute-0 ceph-mon[75031]: 4.f scrub ok
Dec 01 09:17:06 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.7 scrub starts
Dec 01 09:17:06 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.7 scrub ok
Dec 01 09:17:06 compute-0 ceph-mon[75031]: 2.5 deep-scrub starts
Dec 01 09:17:06 compute-0 ceph-mon[75031]: 2.5 deep-scrub ok
Dec 01 09:17:06 compute-0 ceph-mon[75031]: pgmap v154: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:06 compute-0 ceph-mon[75031]: 2.7 scrub starts
Dec 01 09:17:06 compute-0 ceph-mon[75031]: 2.7 scrub ok
Dec 01 09:17:07 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v155: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:07 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.17 scrub starts
Dec 01 09:17:07 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.17 scrub ok
Dec 01 09:17:07 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1c scrub starts
Dec 01 09:17:07 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1c scrub ok
Dec 01 09:17:07 compute-0 ceph-mon[75031]: 3.17 scrub starts
Dec 01 09:17:07 compute-0 ceph-mon[75031]: 3.17 scrub ok
Dec 01 09:17:08 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.13 scrub starts
Dec 01 09:17:08 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.13 scrub ok
Dec 01 09:17:08 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:17:08 compute-0 ceph-mon[75031]: pgmap v155: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:08 compute-0 ceph-mon[75031]: 4.1c scrub starts
Dec 01 09:17:08 compute-0 ceph-mon[75031]: 4.1c scrub ok
Dec 01 09:17:08 compute-0 ceph-mon[75031]: 7.13 scrub starts
Dec 01 09:17:08 compute-0 ceph-mon[75031]: 7.13 scrub ok
Dec 01 09:17:09 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v156: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:09 compute-0 ceph-mon[75031]: pgmap v156: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:11 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v157: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:11 compute-0 ceph-mon[75031]: pgmap v157: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:12 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.15 scrub starts
Dec 01 09:17:12 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.15 scrub ok
Dec 01 09:17:12 compute-0 ceph-mon[75031]: 3.15 scrub starts
Dec 01 09:17:12 compute-0 ceph-mon[75031]: 3.15 scrub ok
Dec 01 09:17:12 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:17:12
Dec 01 09:17:12 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:17:12 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:17:12 compute-0 ceph-mgr[75324]: [balancer INFO root] pools ['vms', 'cephfs.cephfs.data', 'backups', 'cephfs.cephfs.meta', 'volumes', 'images', '.mgr']
Dec 01 09:17:12 compute-0 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec 01 09:17:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:17:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:17:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:17:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:17:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:17:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:17:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:17:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:17:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:17:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:17:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:17:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:17:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:17:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:17:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:17:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:17:13 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v158: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:17:13 compute-0 ceph-mon[75031]: pgmap v158: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:14 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.3 deep-scrub starts
Dec 01 09:17:14 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.3 deep-scrub ok
Dec 01 09:17:14 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.18 scrub starts
Dec 01 09:17:14 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.18 scrub ok
Dec 01 09:17:14 compute-0 ceph-mon[75031]: 2.3 deep-scrub starts
Dec 01 09:17:14 compute-0 ceph-mon[75031]: 2.3 deep-scrub ok
Dec 01 09:17:15 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v159: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:15 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1c scrub starts
Dec 01 09:17:15 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1c scrub ok
Dec 01 09:17:15 compute-0 ceph-mon[75031]: 3.18 scrub starts
Dec 01 09:17:15 compute-0 ceph-mon[75031]: 3.18 scrub ok
Dec 01 09:17:15 compute-0 ceph-mon[75031]: pgmap v159: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:16 compute-0 ceph-mon[75031]: 7.1c scrub starts
Dec 01 09:17:16 compute-0 ceph-mon[75031]: 7.1c scrub ok
Dec 01 09:17:17 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v160: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:17 compute-0 ceph-mon[75031]: pgmap v160: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:18 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.4 scrub starts
Dec 01 09:17:18 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.4 scrub ok
Dec 01 09:17:18 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.12 scrub starts
Dec 01 09:17:18 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.12 scrub ok
Dec 01 09:17:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:17:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:17:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:17:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:17:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:17:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:17:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:17:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:17:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:17:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:17:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:17:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:17:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec 01 09:17:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:17:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:17:18 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:17:18 compute-0 ceph-mon[75031]: 2.4 scrub starts
Dec 01 09:17:18 compute-0 ceph-mon[75031]: 2.4 scrub ok
Dec 01 09:17:18 compute-0 ceph-mon[75031]: 3.12 scrub starts
Dec 01 09:17:18 compute-0 ceph-mon[75031]: 3.12 scrub ok
Dec 01 09:17:19 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v161: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:19 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.f scrub starts
Dec 01 09:17:19 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.f scrub ok
Dec 01 09:17:19 compute-0 ceph-mon[75031]: pgmap v161: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:19 compute-0 ceph-mon[75031]: 3.f scrub starts
Dec 01 09:17:19 compute-0 ceph-mon[75031]: 3.f scrub ok
Dec 01 09:17:20 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.16 scrub starts
Dec 01 09:17:20 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.16 scrub ok
Dec 01 09:17:21 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v162: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:21 compute-0 ceph-mon[75031]: pgmap v162: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:22 compute-0 ceph-mon[75031]: 3.16 scrub starts
Dec 01 09:17:22 compute-0 ceph-mon[75031]: 3.16 scrub ok
Dec 01 09:17:23 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v163: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:23 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.9 scrub starts
Dec 01 09:17:23 compute-0 ceph-mon[75031]: pgmap v163: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:23 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.9 scrub ok
Dec 01 09:17:23 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.9 scrub starts
Dec 01 09:17:23 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.9 scrub ok
Dec 01 09:17:23 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:17:24 compute-0 ceph-mon[75031]: 5.9 scrub starts
Dec 01 09:17:24 compute-0 ceph-mon[75031]: 5.9 scrub ok
Dec 01 09:17:24 compute-0 ceph-mon[75031]: 7.9 scrub starts
Dec 01 09:17:24 compute-0 ceph-mon[75031]: 7.9 scrub ok
Dec 01 09:17:24 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.c deep-scrub starts
Dec 01 09:17:24 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.c deep-scrub ok
Dec 01 09:17:25 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v164: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:25 compute-0 ceph-mon[75031]: 3.c deep-scrub starts
Dec 01 09:17:25 compute-0 ceph-mon[75031]: 3.c deep-scrub ok
Dec 01 09:17:25 compute-0 ceph-mon[75031]: pgmap v164: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:26 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.16 scrub starts
Dec 01 09:17:26 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.16 scrub ok
Dec 01 09:17:26 compute-0 ceph-mon[75031]: 5.16 scrub starts
Dec 01 09:17:26 compute-0 ceph-mon[75031]: 5.16 scrub ok
Dec 01 09:17:27 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v165: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:27 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.f scrub starts
Dec 01 09:17:27 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.f scrub ok
Dec 01 09:17:27 compute-0 ceph-mon[75031]: pgmap v165: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:27 compute-0 ceph-mon[75031]: 7.f scrub starts
Dec 01 09:17:27 compute-0 ceph-mon[75031]: 7.f scrub ok
Dec 01 09:17:28 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:17:28 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.11 scrub starts
Dec 01 09:17:28 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.11 scrub ok
Dec 01 09:17:29 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v166: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:29 compute-0 ceph-mon[75031]: pgmap v166: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:29 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.6 scrub starts
Dec 01 09:17:29 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.6 scrub ok
Dec 01 09:17:30 compute-0 ceph-mon[75031]: 3.11 scrub starts
Dec 01 09:17:30 compute-0 ceph-mon[75031]: 3.11 scrub ok
Dec 01 09:17:30 compute-0 ceph-mon[75031]: 7.6 scrub starts
Dec 01 09:17:30 compute-0 ceph-mon[75031]: 7.6 scrub ok
Dec 01 09:17:30 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.15 scrub starts
Dec 01 09:17:30 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.15 scrub ok
Dec 01 09:17:31 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v167: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:31 compute-0 ceph-mon[75031]: 2.15 scrub starts
Dec 01 09:17:31 compute-0 ceph-mon[75031]: 2.15 scrub ok
Dec 01 09:17:31 compute-0 ceph-mon[75031]: pgmap v167: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.12 deep-scrub starts
Dec 01 09:17:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.12 deep-scrub ok
Dec 01 09:17:32 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.6 deep-scrub starts
Dec 01 09:17:32 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.6 deep-scrub ok
Dec 01 09:17:32 compute-0 ceph-mon[75031]: 5.12 deep-scrub starts
Dec 01 09:17:32 compute-0 ceph-mon[75031]: 5.12 deep-scrub ok
Dec 01 09:17:32 compute-0 ceph-mon[75031]: 3.6 deep-scrub starts
Dec 01 09:17:32 compute-0 ceph-mon[75031]: 3.6 deep-scrub ok
Dec 01 09:17:33 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v168: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:33 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.3 scrub starts
Dec 01 09:17:33 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.3 scrub ok
Dec 01 09:17:33 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:17:33 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.15 scrub starts
Dec 01 09:17:33 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.15 scrub ok
Dec 01 09:17:33 compute-0 ceph-mon[75031]: pgmap v168: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:33 compute-0 ceph-mon[75031]: 7.3 scrub starts
Dec 01 09:17:33 compute-0 ceph-mon[75031]: 7.3 scrub ok
Dec 01 09:17:34 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.9 scrub starts
Dec 01 09:17:34 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.9 scrub ok
Dec 01 09:17:34 compute-0 ceph-mon[75031]: 7.15 scrub starts
Dec 01 09:17:34 compute-0 ceph-mon[75031]: 7.15 scrub ok
Dec 01 09:17:34 compute-0 ceph-mon[75031]: 3.9 scrub starts
Dec 01 09:17:34 compute-0 ceph-mon[75031]: 3.9 scrub ok
Dec 01 09:17:35 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v169: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:35 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.a scrub starts
Dec 01 09:17:35 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.a scrub ok
Dec 01 09:17:35 compute-0 ceph-mon[75031]: pgmap v169: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:35 compute-0 ceph-mon[75031]: 3.a scrub starts
Dec 01 09:17:35 compute-0 ceph-mon[75031]: 3.a scrub ok
Dec 01 09:17:36 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.11 scrub starts
Dec 01 09:17:36 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.11 scrub ok
Dec 01 09:17:37 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v170: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:37 compute-0 ceph-mon[75031]: pgmap v170: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.4 scrub starts
Dec 01 09:17:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.4 scrub ok
Dec 01 09:17:37 compute-0 sudo[103989]: pam_unix(sudo:session): session closed for user root
Dec 01 09:17:38 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.13 scrub starts
Dec 01 09:17:38 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.13 scrub ok
Dec 01 09:17:38 compute-0 ceph-mon[75031]: 7.11 scrub starts
Dec 01 09:17:38 compute-0 ceph-mon[75031]: 7.11 scrub ok
Dec 01 09:17:38 compute-0 ceph-mon[75031]: 7.4 scrub starts
Dec 01 09:17:38 compute-0 ceph-mon[75031]: 7.4 scrub ok
Dec 01 09:17:38 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:17:38 compute-0 sudo[105481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edbyvnekhhevjcuoohucpjkhsnwinxjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580658.1690032-137-19701628234576/AnsiballZ_command.py'
Dec 01 09:17:38 compute-0 sudo[105481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:17:38 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.e scrub starts
Dec 01 09:17:38 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.e scrub ok
Dec 01 09:17:38 compute-0 python3.9[105483]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:17:39 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v171: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:39 compute-0 ceph-mon[75031]: 5.13 scrub starts
Dec 01 09:17:39 compute-0 ceph-mon[75031]: 5.13 scrub ok
Dec 01 09:17:39 compute-0 ceph-mon[75031]: pgmap v171: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:39 compute-0 sudo[105481]: pam_unix(sudo:session): session closed for user root
Dec 01 09:17:40 compute-0 sudo[105768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayygvetogynpmgrjpmfbvrmodmqntoar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580659.4764323-145-227320761361827/AnsiballZ_selinux.py'
Dec 01 09:17:40 compute-0 sudo[105768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:17:40 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.17 scrub starts
Dec 01 09:17:40 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.17 scrub ok
Dec 01 09:17:40 compute-0 ceph-mon[75031]: 3.e scrub starts
Dec 01 09:17:40 compute-0 ceph-mon[75031]: 3.e scrub ok
Dec 01 09:17:40 compute-0 python3.9[105770]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Dec 01 09:17:40 compute-0 sudo[105768]: pam_unix(sudo:session): session closed for user root
Dec 01 09:17:41 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v172: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:41 compute-0 sudo[105920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ircwmfiseudcbkdrciilsxusyhtjapjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580660.8057384-156-114642562631851/AnsiballZ_command.py'
Dec 01 09:17:41 compute-0 sudo[105920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:17:41 compute-0 ceph-mon[75031]: 2.17 scrub starts
Dec 01 09:17:41 compute-0 ceph-mon[75031]: 2.17 scrub ok
Dec 01 09:17:41 compute-0 ceph-mon[75031]: pgmap v172: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:41 compute-0 python3.9[105922]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Dec 01 09:17:41 compute-0 sudo[105920]: pam_unix(sudo:session): session closed for user root
Dec 01 09:17:41 compute-0 sudo[106072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eoriqthjnlibsnrqnffupxgntavoftua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580661.5004134-164-72355845396041/AnsiballZ_file.py'
Dec 01 09:17:41 compute-0 sudo[106072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:17:41 compute-0 python3.9[106074]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:17:42 compute-0 sudo[106072]: pam_unix(sudo:session): session closed for user root
Dec 01 09:17:42 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.a scrub starts
Dec 01 09:17:42 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.a scrub ok
Dec 01 09:17:42 compute-0 sudo[106224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfkbbhpeiaqljulztaunloybuosyswjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580662.1725786-172-25482077176818/AnsiballZ_mount.py'
Dec 01 09:17:42 compute-0 sudo[106224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:17:42 compute-0 python3.9[106226]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Dec 01 09:17:42 compute-0 sudo[106224]: pam_unix(sudo:session): session closed for user root
Dec 01 09:17:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:17:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:17:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:17:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:17:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:17:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:17:43 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v173: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:43 compute-0 ceph-mon[75031]: pgmap v173: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:43 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:17:43 compute-0 sudo[106376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shiyhnohevmzxetrnedpfnvxvjaubawp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580663.6754606-200-166636539118758/AnsiballZ_file.py'
Dec 01 09:17:43 compute-0 sudo[106376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:17:44 compute-0 python3.9[106378]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:17:44 compute-0 sudo[106376]: pam_unix(sudo:session): session closed for user root
Dec 01 09:17:44 compute-0 ceph-mon[75031]: 7.a scrub starts
Dec 01 09:17:44 compute-0 ceph-mon[75031]: 7.a scrub ok
Dec 01 09:17:44 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.1b scrub starts
Dec 01 09:17:44 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.1b scrub ok
Dec 01 09:17:44 compute-0 sudo[106528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szoegffpcbtzzccahaeuybxmaflzqwzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580664.2874596-208-210419884092513/AnsiballZ_stat.py'
Dec 01 09:17:44 compute-0 sudo[106528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:17:44 compute-0 python3.9[106530]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:17:44 compute-0 sudo[106528]: pam_unix(sudo:session): session closed for user root
Dec 01 09:17:44 compute-0 sudo[106606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqwgmlvaadafyplixdxvlcnuhfaagnlw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580664.2874596-208-210419884092513/AnsiballZ_file.py'
Dec 01 09:17:44 compute-0 sudo[106606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:17:45 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v174: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:45 compute-0 python3.9[106608]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem _original_basename=tls-ca-bundle.pem recurse=False state=file path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:17:45 compute-0 sudo[106606]: pam_unix(sudo:session): session closed for user root
Dec 01 09:17:45 compute-0 ceph-mon[75031]: 3.1b scrub starts
Dec 01 09:17:45 compute-0 ceph-mon[75031]: 3.1b scrub ok
Dec 01 09:17:45 compute-0 ceph-mon[75031]: pgmap v174: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:45 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.18 scrub starts
Dec 01 09:17:45 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.18 scrub ok
Dec 01 09:17:45 compute-0 sudo[106758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrmoxemkfrjhvncgwykyhbovakiquuii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580665.6480904-229-142854417337728/AnsiballZ_stat.py'
Dec 01 09:17:45 compute-0 sudo[106758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:17:46 compute-0 python3.9[106760]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:17:46 compute-0 sudo[106758]: pam_unix(sudo:session): session closed for user root
Dec 01 09:17:46 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.a scrub starts
Dec 01 09:17:46 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.a scrub ok
Dec 01 09:17:46 compute-0 ceph-mon[75031]: 7.18 scrub starts
Dec 01 09:17:46 compute-0 ceph-mon[75031]: 7.18 scrub ok
Dec 01 09:17:46 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.1f scrub starts
Dec 01 09:17:46 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.1f scrub ok
Dec 01 09:17:47 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v175: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:47 compute-0 sudo[106912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqxxqimiqwsqkgmqeuzhnybgzrqbyzng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580666.7365127-242-7019209473982/AnsiballZ_getent.py'
Dec 01 09:17:47 compute-0 sudo[106912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:17:47 compute-0 ceph-mon[75031]: 2.a scrub starts
Dec 01 09:17:47 compute-0 ceph-mon[75031]: 2.a scrub ok
Dec 01 09:17:47 compute-0 ceph-mon[75031]: 7.1f scrub starts
Dec 01 09:17:47 compute-0 ceph-mon[75031]: 7.1f scrub ok
Dec 01 09:17:47 compute-0 ceph-mon[75031]: pgmap v175: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:47 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.1f scrub starts
Dec 01 09:17:47 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.1f scrub ok
Dec 01 09:17:47 compute-0 python3.9[106914]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Dec 01 09:17:47 compute-0 sudo[106912]: pam_unix(sudo:session): session closed for user root
Dec 01 09:17:47 compute-0 sudo[107065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfpbirpoxseedotckjkqqamfukztnuad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580667.6728857-252-224061368417197/AnsiballZ_getent.py'
Dec 01 09:17:47 compute-0 sudo[107065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:17:48 compute-0 python3.9[107067]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Dec 01 09:17:48 compute-0 sudo[107065]: pam_unix(sudo:session): session closed for user root
Dec 01 09:17:48 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.d scrub starts
Dec 01 09:17:48 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.d scrub ok
Dec 01 09:17:48 compute-0 ceph-mon[75031]: 3.1f scrub starts
Dec 01 09:17:48 compute-0 ceph-mon[75031]: 3.1f scrub ok
Dec 01 09:17:48 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:17:48 compute-0 sudo[107218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvfwsxirtsrftjznxmqjpnxjopdyedyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580668.291741-260-185303789547927/AnsiballZ_group.py'
Dec 01 09:17:48 compute-0 sudo[107218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:17:48 compute-0 python3.9[107220]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 01 09:17:49 compute-0 sudo[107218]: pam_unix(sudo:session): session closed for user root
Dec 01 09:17:49 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v176: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:49 compute-0 ceph-mon[75031]: 2.d scrub starts
Dec 01 09:17:49 compute-0 ceph-mon[75031]: 2.d scrub ok
Dec 01 09:17:49 compute-0 ceph-mon[75031]: pgmap v176: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:49 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.1b scrub starts
Dec 01 09:17:49 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.1b scrub ok
Dec 01 09:17:49 compute-0 sudo[107370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryohbpruqknlztkdnpvnmhdvqdchuxyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580669.2057266-269-136662488169189/AnsiballZ_file.py'
Dec 01 09:17:49 compute-0 sudo[107370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:17:49 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.8 scrub starts
Dec 01 09:17:49 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.8 scrub ok
Dec 01 09:17:49 compute-0 python3.9[107372]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Dec 01 09:17:49 compute-0 sudo[107370]: pam_unix(sudo:session): session closed for user root
Dec 01 09:17:50 compute-0 sudo[107522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wthfroabmvnhwnukowrbthmzbbiokgod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580670.0130517-280-172898014698732/AnsiballZ_dnf.py'
Dec 01 09:17:50 compute-0 sudo[107522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:17:50 compute-0 ceph-mon[75031]: 7.1b scrub starts
Dec 01 09:17:50 compute-0 ceph-mon[75031]: 7.1b scrub ok
Dec 01 09:17:50 compute-0 ceph-mon[75031]: 7.8 scrub starts
Dec 01 09:17:50 compute-0 ceph-mon[75031]: 7.8 scrub ok
Dec 01 09:17:50 compute-0 python3.9[107524]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:17:51 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v177: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:51 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.6 scrub starts
Dec 01 09:17:51 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.6 scrub ok
Dec 01 09:17:51 compute-0 ceph-mon[75031]: pgmap v177: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:52 compute-0 sudo[107522]: pam_unix(sudo:session): session closed for user root
Dec 01 09:17:52 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.4 scrub starts
Dec 01 09:17:52 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.4 scrub ok
Dec 01 09:17:52 compute-0 ceph-mon[75031]: 6.6 scrub starts
Dec 01 09:17:52 compute-0 ceph-mon[75031]: 6.6 scrub ok
Dec 01 09:17:52 compute-0 sudo[107675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-quuyyqumayrkpvjakuoojmfkptlrebsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580672.2274528-288-91550764237063/AnsiballZ_file.py'
Dec 01 09:17:52 compute-0 sudo[107675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:17:52 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.5 scrub starts
Dec 01 09:17:52 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.5 scrub ok
Dec 01 09:17:52 compute-0 python3.9[107677]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:17:52 compute-0 sudo[107675]: pam_unix(sudo:session): session closed for user root
Dec 01 09:17:53 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v178: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:53 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1 scrub starts
Dec 01 09:17:53 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1 scrub ok
Dec 01 09:17:53 compute-0 ceph-mon[75031]: 4.4 scrub starts
Dec 01 09:17:53 compute-0 ceph-mon[75031]: 4.4 scrub ok
Dec 01 09:17:53 compute-0 ceph-mon[75031]: 7.5 scrub starts
Dec 01 09:17:53 compute-0 ceph-mon[75031]: 7.5 scrub ok
Dec 01 09:17:53 compute-0 ceph-mon[75031]: pgmap v178: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:53 compute-0 sudo[107827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhgnfnmmrjdegmulznijetxrtzocxgqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580672.9649298-296-156057178580086/AnsiballZ_stat.py'
Dec 01 09:17:53 compute-0 sudo[107827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:17:53 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:17:53 compute-0 python3.9[107829]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:17:53 compute-0 sudo[107827]: pam_unix(sudo:session): session closed for user root
Dec 01 09:17:53 compute-0 sudo[107905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmbkkyjggbioezkvdjixmcrkipemloxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580672.9649298-296-156057178580086/AnsiballZ_file.py'
Dec 01 09:17:53 compute-0 sudo[107905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:17:54 compute-0 python3.9[107907]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/modules-load.d/99-edpm.conf _original_basename=edpm-modprobe.conf.j2 recurse=False state=file path=/etc/modules-load.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:17:54 compute-0 sudo[107905]: pam_unix(sudo:session): session closed for user root
Dec 01 09:17:54 compute-0 ceph-mon[75031]: 6.1 scrub starts
Dec 01 09:17:54 compute-0 ceph-mon[75031]: 6.1 scrub ok
Dec 01 09:17:54 compute-0 sudo[108057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdnzgrhusbmjlvnoavebncqgcmypcxlw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580674.2784338-309-218424548543696/AnsiballZ_stat.py'
Dec 01 09:17:54 compute-0 sudo[108057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:17:54 compute-0 python3.9[108059]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:17:54 compute-0 sudo[108057]: pam_unix(sudo:session): session closed for user root
Dec 01 09:17:54 compute-0 sudo[108135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajkewczgdqqzxwnnhhqfzwhkbugdxfxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580674.2784338-309-218424548543696/AnsiballZ_file.py'
Dec 01 09:17:54 compute-0 sudo[108135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:17:55 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v179: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:55 compute-0 python3.9[108137]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/sysctl.d/99-edpm.conf _original_basename=edpm-sysctl.conf.j2 recurse=False state=file path=/etc/sysctl.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:17:55 compute-0 sudo[108135]: pam_unix(sudo:session): session closed for user root
Dec 01 09:17:55 compute-0 sudo[108223]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:17:55 compute-0 sudo[108223]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:17:55 compute-0 sudo[108223]: pam_unix(sudo:session): session closed for user root
Dec 01 09:17:55 compute-0 sudo[108269]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:17:55 compute-0 sudo[108269]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:17:55 compute-0 sudo[108269]: pam_unix(sudo:session): session closed for user root
Dec 01 09:17:55 compute-0 sudo[108358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpskekpfxoslsivbwilabkalnujttfin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580675.544807-324-14844480647689/AnsiballZ_dnf.py'
Dec 01 09:17:55 compute-0 sudo[108358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:17:55 compute-0 sudo[108318]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:17:55 compute-0 sudo[108318]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:17:55 compute-0 sudo[108318]: pam_unix(sudo:session): session closed for user root
Dec 01 09:17:55 compute-0 sudo[108365]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Dec 01 09:17:55 compute-0 sudo[108365]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:17:56 compute-0 python3.9[108363]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:17:56 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.7 scrub starts
Dec 01 09:17:56 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.7 scrub ok
Dec 01 09:17:56 compute-0 sudo[108365]: pam_unix(sudo:session): session closed for user root
Dec 01 09:17:56 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:17:56 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:17:56 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec 01 09:17:56 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:17:56 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec 01 09:17:56 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:17:56 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 508a59e6-abeb-4f7e-84f6-648341ece6d2 does not exist
Dec 01 09:17:56 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 601325cf-341b-4646-8184-e7d4acb0a43a does not exist
Dec 01 09:17:56 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 8722566b-2e50-492c-b4f2-e2f4f2e5c366 does not exist
Dec 01 09:17:56 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec 01 09:17:56 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:17:56 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec 01 09:17:56 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:17:56 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:17:56 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:17:56 compute-0 ceph-mon[75031]: pgmap v179: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:56 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:17:56 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:17:56 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:17:56 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:17:56 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:17:56 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:17:56 compute-0 sudo[108422]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:17:56 compute-0 sudo[108422]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:17:56 compute-0 sudo[108422]: pam_unix(sudo:session): session closed for user root
Dec 01 09:17:56 compute-0 sudo[108447]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:17:56 compute-0 sudo[108447]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:17:56 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.2 deep-scrub starts
Dec 01 09:17:56 compute-0 sudo[108447]: pam_unix(sudo:session): session closed for user root
Dec 01 09:17:56 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.2 deep-scrub ok
Dec 01 09:17:56 compute-0 sudo[108472]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:17:56 compute-0 sudo[108472]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:17:56 compute-0 sudo[108472]: pam_unix(sudo:session): session closed for user root
Dec 01 09:17:56 compute-0 sudo[108497]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --yes --no-systemd
Dec 01 09:17:56 compute-0 sudo[108497]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:17:56 compute-0 podman[108560]: 2025-12-01 09:17:56.99070135 +0000 UTC m=+0.047149136 container create 0a120bc90152132962c41f8d4c18f0d1cf7ae252a21c5e5c69ab083429f31609 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_jennings, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.license=GPLv2)
Dec 01 09:17:57 compute-0 systemd[1]: Started libpod-conmon-0a120bc90152132962c41f8d4c18f0d1cf7ae252a21c5e5c69ab083429f31609.scope.
Dec 01 09:17:57 compute-0 podman[108560]: 2025-12-01 09:17:56.969155435 +0000 UTC m=+0.025603221 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:17:57 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:17:57 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v180: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:57 compute-0 podman[108560]: 2025-12-01 09:17:57.102937712 +0000 UTC m=+0.159385538 container init 0a120bc90152132962c41f8d4c18f0d1cf7ae252a21c5e5c69ab083429f31609 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_jennings, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:17:57 compute-0 podman[108560]: 2025-12-01 09:17:57.114221654 +0000 UTC m=+0.170669440 container start 0a120bc90152132962c41f8d4c18f0d1cf7ae252a21c5e5c69ab083429f31609 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_jennings, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 01 09:17:57 compute-0 podman[108560]: 2025-12-01 09:17:57.119265758 +0000 UTC m=+0.175713524 container attach 0a120bc90152132962c41f8d4c18f0d1cf7ae252a21c5e5c69ab083429f31609 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_jennings, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:17:57 compute-0 nice_jennings[108576]: 167 167
Dec 01 09:17:57 compute-0 systemd[1]: libpod-0a120bc90152132962c41f8d4c18f0d1cf7ae252a21c5e5c69ab083429f31609.scope: Deactivated successfully.
Dec 01 09:17:57 compute-0 podman[108560]: 2025-12-01 09:17:57.123329384 +0000 UTC m=+0.179777160 container died 0a120bc90152132962c41f8d4c18f0d1cf7ae252a21c5e5c69ab083429f31609 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_jennings, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:17:57 compute-0 systemd[1]: var-lib-containers-storage-overlay-2af10e1e43a76bdf5d8e118a39cc600d273d25af88865c277b99ecfd2945f6c3-merged.mount: Deactivated successfully.
Dec 01 09:17:57 compute-0 podman[108560]: 2025-12-01 09:17:57.160839644 +0000 UTC m=+0.217287410 container remove 0a120bc90152132962c41f8d4c18f0d1cf7ae252a21c5e5c69ab083429f31609 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_jennings, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:17:57 compute-0 systemd[1]: libpod-conmon-0a120bc90152132962c41f8d4c18f0d1cf7ae252a21c5e5c69ab083429f31609.scope: Deactivated successfully.
Dec 01 09:17:57 compute-0 podman[108601]: 2025-12-01 09:17:57.338871163 +0000 UTC m=+0.046195049 container create 56a2fd1b90953085dfac16777582664409cb1591b681e33b06549911190df439 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eager_yalow, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True)
Dec 01 09:17:57 compute-0 systemd[1]: Started libpod-conmon-56a2fd1b90953085dfac16777582664409cb1591b681e33b06549911190df439.scope.
Dec 01 09:17:57 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:17:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb5b3d103514b8c4fb6619ee9c66c4f491a680a111199139f2229db15586aeb0/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:17:57 compute-0 podman[108601]: 2025-12-01 09:17:57.318515303 +0000 UTC m=+0.025839219 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:17:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb5b3d103514b8c4fb6619ee9c66c4f491a680a111199139f2229db15586aeb0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:17:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb5b3d103514b8c4fb6619ee9c66c4f491a680a111199139f2229db15586aeb0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:17:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb5b3d103514b8c4fb6619ee9c66c4f491a680a111199139f2229db15586aeb0/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:17:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb5b3d103514b8c4fb6619ee9c66c4f491a680a111199139f2229db15586aeb0/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:17:57 compute-0 ceph-mon[75031]: 4.7 scrub starts
Dec 01 09:17:57 compute-0 ceph-mon[75031]: 4.7 scrub ok
Dec 01 09:17:57 compute-0 ceph-mon[75031]: 7.2 deep-scrub starts
Dec 01 09:17:57 compute-0 ceph-mon[75031]: 7.2 deep-scrub ok
Dec 01 09:17:57 compute-0 podman[108601]: 2025-12-01 09:17:57.444013473 +0000 UTC m=+0.151337389 container init 56a2fd1b90953085dfac16777582664409cb1591b681e33b06549911190df439 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eager_yalow, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:17:57 compute-0 podman[108601]: 2025-12-01 09:17:57.452072913 +0000 UTC m=+0.159396799 container start 56a2fd1b90953085dfac16777582664409cb1591b681e33b06549911190df439 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eager_yalow, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=reef, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:17:57 compute-0 podman[108601]: 2025-12-01 09:17:57.457316093 +0000 UTC m=+0.164640009 container attach 56a2fd1b90953085dfac16777582664409cb1591b681e33b06549911190df439 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eager_yalow, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250507)
Dec 01 09:17:57 compute-0 sudo[108358]: pam_unix(sudo:session): session closed for user root
Dec 01 09:17:58 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.b scrub starts
Dec 01 09:17:58 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.b scrub ok
Dec 01 09:17:58 compute-0 ceph-mon[75031]: pgmap v180: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:58 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.8 scrub starts
Dec 01 09:17:58 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:17:58 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.8 scrub ok
Dec 01 09:17:58 compute-0 eager_yalow[108618]: --> passed data devices: 0 physical, 3 LVM
Dec 01 09:17:58 compute-0 eager_yalow[108618]: --> relative data size: 1.0
Dec 01 09:17:58 compute-0 eager_yalow[108618]: --> All data devices are unavailable
Dec 01 09:17:58 compute-0 python3.9[108786]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:17:58 compute-0 systemd[1]: libpod-56a2fd1b90953085dfac16777582664409cb1591b681e33b06549911190df439.scope: Deactivated successfully.
Dec 01 09:17:58 compute-0 systemd[1]: libpod-56a2fd1b90953085dfac16777582664409cb1591b681e33b06549911190df439.scope: Consumed 1.037s CPU time.
Dec 01 09:17:58 compute-0 podman[108601]: 2025-12-01 09:17:58.561190705 +0000 UTC m=+1.268514591 container died 56a2fd1b90953085dfac16777582664409cb1591b681e33b06549911190df439 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eager_yalow, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.39.3)
Dec 01 09:17:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-eb5b3d103514b8c4fb6619ee9c66c4f491a680a111199139f2229db15586aeb0-merged.mount: Deactivated successfully.
Dec 01 09:17:58 compute-0 podman[108601]: 2025-12-01 09:17:58.617127291 +0000 UTC m=+1.324451167 container remove 56a2fd1b90953085dfac16777582664409cb1591b681e33b06549911190df439 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eager_yalow, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:17:58 compute-0 systemd[1]: libpod-conmon-56a2fd1b90953085dfac16777582664409cb1591b681e33b06549911190df439.scope: Deactivated successfully.
Dec 01 09:17:58 compute-0 sudo[108497]: pam_unix(sudo:session): session closed for user root
Dec 01 09:17:58 compute-0 sudo[108835]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:17:58 compute-0 sudo[108835]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:17:58 compute-0 sudo[108835]: pam_unix(sudo:session): session closed for user root
Dec 01 09:17:58 compute-0 sudo[108863]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:17:58 compute-0 sudo[108863]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:17:58 compute-0 sudo[108863]: pam_unix(sudo:session): session closed for user root
Dec 01 09:17:58 compute-0 sudo[108926]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:17:58 compute-0 sudo[108926]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:17:58 compute-0 sudo[108926]: pam_unix(sudo:session): session closed for user root
Dec 01 09:17:58 compute-0 sudo[108962]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- lvm list --format json
Dec 01 09:17:58 compute-0 sudo[108962]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:17:59 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v181: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:17:59 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.5 scrub starts
Dec 01 09:17:59 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.5 scrub ok
Dec 01 09:17:59 compute-0 podman[109097]: 2025-12-01 09:17:59.284649954 +0000 UTC m=+0.046818486 container create ef16fd737e7908c6e0bdb4f80c1a78c3482e5a7681a1c0fd3627061a75784c8c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_einstein, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:17:59 compute-0 systemd[1]: Started libpod-conmon-ef16fd737e7908c6e0bdb4f80c1a78c3482e5a7681a1c0fd3627061a75784c8c.scope.
Dec 01 09:17:59 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:17:59 compute-0 podman[109097]: 2025-12-01 09:17:59.261084922 +0000 UTC m=+0.023253474 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:17:59 compute-0 python3.9[109091]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec 01 09:17:59 compute-0 podman[109097]: 2025-12-01 09:17:59.367429806 +0000 UTC m=+0.129598368 container init ef16fd737e7908c6e0bdb4f80c1a78c3482e5a7681a1c0fd3627061a75784c8c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_einstein, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:17:59 compute-0 podman[109097]: 2025-12-01 09:17:59.379973894 +0000 UTC m=+0.142142436 container start ef16fd737e7908c6e0bdb4f80c1a78c3482e5a7681a1c0fd3627061a75784c8c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_einstein, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, ceph=True, org.label-schema.build-date=20250507)
Dec 01 09:17:59 compute-0 infallible_einstein[109114]: 167 167
Dec 01 09:17:59 compute-0 podman[109097]: 2025-12-01 09:17:59.385918204 +0000 UTC m=+0.148086776 container attach ef16fd737e7908c6e0bdb4f80c1a78c3482e5a7681a1c0fd3627061a75784c8c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_einstein, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Dec 01 09:17:59 compute-0 systemd[1]: libpod-ef16fd737e7908c6e0bdb4f80c1a78c3482e5a7681a1c0fd3627061a75784c8c.scope: Deactivated successfully.
Dec 01 09:17:59 compute-0 podman[109097]: 2025-12-01 09:17:59.387595281 +0000 UTC m=+0.149763883 container died ef16fd737e7908c6e0bdb4f80c1a78c3482e5a7681a1c0fd3627061a75784c8c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_einstein, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:17:59 compute-0 systemd[1]: var-lib-containers-storage-overlay-4d61e15a1c8425063b55a03892a92ef5a382207c04fdd320093c10c278c8b561-merged.mount: Deactivated successfully.
Dec 01 09:17:59 compute-0 podman[109097]: 2025-12-01 09:17:59.446527863 +0000 UTC m=+0.208696435 container remove ef16fd737e7908c6e0bdb4f80c1a78c3482e5a7681a1c0fd3627061a75784c8c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_einstein, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507)
Dec 01 09:17:59 compute-0 systemd[1]: libpod-conmon-ef16fd737e7908c6e0bdb4f80c1a78c3482e5a7681a1c0fd3627061a75784c8c.scope: Deactivated successfully.
Dec 01 09:17:59 compute-0 ceph-mon[75031]: 6.b scrub starts
Dec 01 09:17:59 compute-0 ceph-mon[75031]: 6.b scrub ok
Dec 01 09:17:59 compute-0 ceph-mon[75031]: 3.8 scrub starts
Dec 01 09:17:59 compute-0 ceph-mon[75031]: 3.8 scrub ok
Dec 01 09:17:59 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.c deep-scrub starts
Dec 01 09:17:59 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.c deep-scrub ok
Dec 01 09:17:59 compute-0 podman[109184]: 2025-12-01 09:17:59.649969667 +0000 UTC m=+0.048403602 container create 3144f1b0e2e992601fa2ff8bf56e786c61daac7586c0e06c61b0fc9a21545ba2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_sinoussi, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec 01 09:17:59 compute-0 systemd[1]: Started libpod-conmon-3144f1b0e2e992601fa2ff8bf56e786c61daac7586c0e06c61b0fc9a21545ba2.scope.
Dec 01 09:17:59 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:17:59 compute-0 podman[109184]: 2025-12-01 09:17:59.629860743 +0000 UTC m=+0.028294698 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:17:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef126a3aab5a2f87974a167fa5f64ac43bb71f0c9fbef155d675d34b54d65d84/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:17:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef126a3aab5a2f87974a167fa5f64ac43bb71f0c9fbef155d675d34b54d65d84/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:17:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef126a3aab5a2f87974a167fa5f64ac43bb71f0c9fbef155d675d34b54d65d84/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:17:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef126a3aab5a2f87974a167fa5f64ac43bb71f0c9fbef155d675d34b54d65d84/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:17:59 compute-0 podman[109184]: 2025-12-01 09:17:59.744527294 +0000 UTC m=+0.142961259 container init 3144f1b0e2e992601fa2ff8bf56e786c61daac7586c0e06c61b0fc9a21545ba2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_sinoussi, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef)
Dec 01 09:17:59 compute-0 podman[109184]: 2025-12-01 09:17:59.751609727 +0000 UTC m=+0.150043672 container start 3144f1b0e2e992601fa2ff8bf56e786c61daac7586c0e06c61b0fc9a21545ba2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_sinoussi, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:17:59 compute-0 podman[109184]: 2025-12-01 09:17:59.755676332 +0000 UTC m=+0.154110287 container attach 3144f1b0e2e992601fa2ff8bf56e786c61daac7586c0e06c61b0fc9a21545ba2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_sinoussi, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:18:00 compute-0 python3.9[109309]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:18:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.7 scrub starts
Dec 01 09:18:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.7 scrub ok
Dec 01 09:18:00 compute-0 ceph-mon[75031]: pgmap v181: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:00 compute-0 ceph-mon[75031]: 4.5 scrub starts
Dec 01 09:18:00 compute-0 ceph-mon[75031]: 4.5 scrub ok
Dec 01 09:18:00 compute-0 ceph-mon[75031]: 7.c deep-scrub starts
Dec 01 09:18:00 compute-0 ceph-mon[75031]: 7.c deep-scrub ok
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]: {
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:     "0": [
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:         {
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             "devices": [
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "/dev/loop3"
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             ],
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             "lv_name": "ceph_lv0",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             "lv_size": "21470642176",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             "name": "ceph_lv0",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             "tags": {
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.cluster_name": "ceph",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.crush_device_class": "",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.encrypted": "0",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.osd_id": "0",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.type": "block",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.vdo": "0"
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             },
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             "type": "block",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             "vg_name": "ceph_vg0"
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:         }
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:     ],
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:     "1": [
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:         {
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             "devices": [
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "/dev/loop4"
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             ],
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             "lv_name": "ceph_lv1",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             "lv_size": "21470642176",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             "name": "ceph_lv1",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             "tags": {
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.cluster_name": "ceph",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.crush_device_class": "",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.encrypted": "0",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.osd_id": "1",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.type": "block",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.vdo": "0"
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             },
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             "type": "block",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             "vg_name": "ceph_vg1"
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:         }
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:     ],
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:     "2": [
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:         {
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             "devices": [
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "/dev/loop5"
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             ],
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             "lv_name": "ceph_lv2",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             "lv_size": "21470642176",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             "name": "ceph_lv2",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             "tags": {
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.cluster_name": "ceph",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.crush_device_class": "",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.encrypted": "0",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.osd_id": "2",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.type": "block",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:                 "ceph.vdo": "0"
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             },
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             "type": "block",
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:             "vg_name": "ceph_vg2"
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:         }
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]:     ]
Dec 01 09:18:00 compute-0 flamboyant_sinoussi[109252]: }
Dec 01 09:18:00 compute-0 systemd[1]: libpod-3144f1b0e2e992601fa2ff8bf56e786c61daac7586c0e06c61b0fc9a21545ba2.scope: Deactivated successfully.
Dec 01 09:18:00 compute-0 podman[109184]: 2025-12-01 09:18:00.541997146 +0000 UTC m=+0.940431081 container died 3144f1b0e2e992601fa2ff8bf56e786c61daac7586c0e06c61b0fc9a21545ba2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_sinoussi, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec 01 09:18:00 compute-0 systemd[1]: var-lib-containers-storage-overlay-ef126a3aab5a2f87974a167fa5f64ac43bb71f0c9fbef155d675d34b54d65d84-merged.mount: Deactivated successfully.
Dec 01 09:18:00 compute-0 podman[109184]: 2025-12-01 09:18:00.601108883 +0000 UTC m=+0.999542818 container remove 3144f1b0e2e992601fa2ff8bf56e786c61daac7586c0e06c61b0fc9a21545ba2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_sinoussi, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, ceph=True)
Dec 01 09:18:00 compute-0 systemd[1]: libpod-conmon-3144f1b0e2e992601fa2ff8bf56e786c61daac7586c0e06c61b0fc9a21545ba2.scope: Deactivated successfully.
Dec 01 09:18:00 compute-0 sudo[108962]: pam_unix(sudo:session): session closed for user root
Dec 01 09:18:00 compute-0 sudo[109350]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:18:00 compute-0 sudo[109350]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:18:00 compute-0 sudo[109350]: pam_unix(sudo:session): session closed for user root
Dec 01 09:18:00 compute-0 sudo[109403]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:18:00 compute-0 sudo[109403]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:18:00 compute-0 sudo[109403]: pam_unix(sudo:session): session closed for user root
Dec 01 09:18:00 compute-0 sudo[109452]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:18:00 compute-0 sudo[109452]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:18:00 compute-0 sudo[109452]: pam_unix(sudo:session): session closed for user root
Dec 01 09:18:00 compute-0 sudo[109477]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- raw list --format json
Dec 01 09:18:00 compute-0 sudo[109477]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:18:01 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v182: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:01 compute-0 podman[109564]: 2025-12-01 09:18:01.355346081 +0000 UTC m=+0.095622069 container create 2f555bb1fa017585d7a81759b364229fa25db7d3951c626cf35d225bfd9dd893 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_euler, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:18:01 compute-0 podman[109564]: 2025-12-01 09:18:01.289087761 +0000 UTC m=+0.029363839 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:18:01 compute-0 systemd[1]: Started libpod-conmon-2f555bb1fa017585d7a81759b364229fa25db7d3951c626cf35d225bfd9dd893.scope.
Dec 01 09:18:01 compute-0 sudo[109628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttnysgnytlhetasbxuhsekonsxnupasb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580680.6982985-365-78824069914369/AnsiballZ_systemd.py'
Dec 01 09:18:01 compute-0 sudo[109628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:18:01 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:18:01 compute-0 podman[109564]: 2025-12-01 09:18:01.444571017 +0000 UTC m=+0.184847055 container init 2f555bb1fa017585d7a81759b364229fa25db7d3951c626cf35d225bfd9dd893 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_euler, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:18:01 compute-0 podman[109564]: 2025-12-01 09:18:01.452858493 +0000 UTC m=+0.193134491 container start 2f555bb1fa017585d7a81759b364229fa25db7d3951c626cf35d225bfd9dd893 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_euler, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:18:01 compute-0 hardcore_euler[109633]: 167 167
Dec 01 09:18:01 compute-0 systemd[1]: libpod-2f555bb1fa017585d7a81759b364229fa25db7d3951c626cf35d225bfd9dd893.scope: Deactivated successfully.
Dec 01 09:18:01 compute-0 conmon[109633]: conmon 2f555bb1fa017585d7a8 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-2f555bb1fa017585d7a81759b364229fa25db7d3951c626cf35d225bfd9dd893.scope/container/memory.events
Dec 01 09:18:01 compute-0 podman[109564]: 2025-12-01 09:18:01.460743358 +0000 UTC m=+0.201019346 container attach 2f555bb1fa017585d7a81759b364229fa25db7d3951c626cf35d225bfd9dd893 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_euler, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Dec 01 09:18:01 compute-0 podman[109564]: 2025-12-01 09:18:01.461641274 +0000 UTC m=+0.201917262 container died 2f555bb1fa017585d7a81759b364229fa25db7d3951c626cf35d225bfd9dd893 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_euler, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Dec 01 09:18:01 compute-0 ceph-mon[75031]: 3.7 scrub starts
Dec 01 09:18:01 compute-0 ceph-mon[75031]: 3.7 scrub ok
Dec 01 09:18:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-c81106908f25a4f9a1dddd7354a9b5a02d63b40b985a538cb95eb5867727dddb-merged.mount: Deactivated successfully.
Dec 01 09:18:01 compute-0 podman[109564]: 2025-12-01 09:18:01.532600809 +0000 UTC m=+0.272876807 container remove 2f555bb1fa017585d7a81759b364229fa25db7d3951c626cf35d225bfd9dd893 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_euler, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Dec 01 09:18:01 compute-0 systemd[1]: libpod-conmon-2f555bb1fa017585d7a81759b364229fa25db7d3951c626cf35d225bfd9dd893.scope: Deactivated successfully.
Dec 01 09:18:01 compute-0 python3.9[109635]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:18:01 compute-0 podman[109660]: 2025-12-01 09:18:01.726752438 +0000 UTC m=+0.051876401 container create d64d24eb86f12b177ac9daa02ae45ce4d9dd7b2c8d446957bb957ead7152e684 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_keller, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 01 09:18:01 compute-0 systemd[1]: Started libpod-conmon-d64d24eb86f12b177ac9daa02ae45ce4d9dd7b2c8d446957bb957ead7152e684.scope.
Dec 01 09:18:01 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec 01 09:18:01 compute-0 podman[109660]: 2025-12-01 09:18:01.704378919 +0000 UTC m=+0.029502892 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:18:01 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:18:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8d3b2734e6ac35c6d18e38c1d6c6f82c3055efaed8f2e82295d1926b562028a/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:18:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8d3b2734e6ac35c6d18e38c1d6c6f82c3055efaed8f2e82295d1926b562028a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:18:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8d3b2734e6ac35c6d18e38c1d6c6f82c3055efaed8f2e82295d1926b562028a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:18:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8d3b2734e6ac35c6d18e38c1d6c6f82c3055efaed8f2e82295d1926b562028a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:18:01 compute-0 podman[109660]: 2025-12-01 09:18:01.823441546 +0000 UTC m=+0.148565529 container init d64d24eb86f12b177ac9daa02ae45ce4d9dd7b2c8d446957bb957ead7152e684 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_keller, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:18:01 compute-0 podman[109660]: 2025-12-01 09:18:01.838871786 +0000 UTC m=+0.163995759 container start d64d24eb86f12b177ac9daa02ae45ce4d9dd7b2c8d446957bb957ead7152e684 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_keller, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:18:01 compute-0 podman[109660]: 2025-12-01 09:18:01.843584271 +0000 UTC m=+0.168708274 container attach d64d24eb86f12b177ac9daa02ae45ce4d9dd7b2c8d446957bb957ead7152e684 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_keller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, ceph=True, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:18:01 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Dec 01 09:18:01 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec 01 09:18:01 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 01 09:18:02 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.e scrub starts
Dec 01 09:18:02 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Dec 01 09:18:02 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.e scrub ok
Dec 01 09:18:02 compute-0 sudo[109628]: pam_unix(sudo:session): session closed for user root
Dec 01 09:18:02 compute-0 ceph-mon[75031]: pgmap v182: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:02 compute-0 practical_keller[109681]: {
Dec 01 09:18:02 compute-0 practical_keller[109681]:     "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec 01 09:18:02 compute-0 practical_keller[109681]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:18:02 compute-0 practical_keller[109681]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 01 09:18:02 compute-0 practical_keller[109681]:         "osd_id": 0,
Dec 01 09:18:02 compute-0 practical_keller[109681]:         "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:18:02 compute-0 practical_keller[109681]:         "type": "bluestore"
Dec 01 09:18:02 compute-0 practical_keller[109681]:     },
Dec 01 09:18:02 compute-0 practical_keller[109681]:     "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec 01 09:18:02 compute-0 practical_keller[109681]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:18:02 compute-0 practical_keller[109681]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 01 09:18:02 compute-0 practical_keller[109681]:         "osd_id": 1,
Dec 01 09:18:02 compute-0 practical_keller[109681]:         "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:18:02 compute-0 practical_keller[109681]:         "type": "bluestore"
Dec 01 09:18:02 compute-0 practical_keller[109681]:     },
Dec 01 09:18:02 compute-0 practical_keller[109681]:     "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec 01 09:18:02 compute-0 practical_keller[109681]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:18:02 compute-0 practical_keller[109681]:         "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec 01 09:18:02 compute-0 practical_keller[109681]:         "osd_id": 2,
Dec 01 09:18:02 compute-0 practical_keller[109681]:         "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:18:02 compute-0 practical_keller[109681]:         "type": "bluestore"
Dec 01 09:18:02 compute-0 practical_keller[109681]:     }
Dec 01 09:18:02 compute-0 practical_keller[109681]: }
Dec 01 09:18:02 compute-0 systemd[1]: libpod-d64d24eb86f12b177ac9daa02ae45ce4d9dd7b2c8d446957bb957ead7152e684.scope: Deactivated successfully.
Dec 01 09:18:02 compute-0 systemd[1]: libpod-d64d24eb86f12b177ac9daa02ae45ce4d9dd7b2c8d446957bb957ead7152e684.scope: Consumed 1.095s CPU time.
Dec 01 09:18:02 compute-0 python3.9[109860]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec 01 09:18:02 compute-0 podman[109872]: 2025-12-01 09:18:02.980358192 +0000 UTC m=+0.029968026 container died d64d24eb86f12b177ac9daa02ae45ce4d9dd7b2c8d446957bb957ead7152e684 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_keller, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:18:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-b8d3b2734e6ac35c6d18e38c1d6c6f82c3055efaed8f2e82295d1926b562028a-merged.mount: Deactivated successfully.
Dec 01 09:18:03 compute-0 podman[109872]: 2025-12-01 09:18:03.045686366 +0000 UTC m=+0.095296170 container remove d64d24eb86f12b177ac9daa02ae45ce4d9dd7b2c8d446957bb957ead7152e684 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_keller, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=reef, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:18:03 compute-0 systemd[1]: libpod-conmon-d64d24eb86f12b177ac9daa02ae45ce4d9dd7b2c8d446957bb957ead7152e684.scope: Deactivated successfully.
Dec 01 09:18:03 compute-0 sudo[109477]: pam_unix(sudo:session): session closed for user root
Dec 01 09:18:03 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:18:03 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:18:03 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v183: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:03 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:18:03 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:18:03 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 01fcc80d-6709-44b6-be41-3e809551f01e does not exist
Dec 01 09:18:03 compute-0 sudo[109911]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:18:03 compute-0 sudo[109911]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:18:03 compute-0 sudo[109911]: pam_unix(sudo:session): session closed for user root
Dec 01 09:18:03 compute-0 sudo[109936]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:18:03 compute-0 sudo[109936]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:18:03 compute-0 sudo[109936]: pam_unix(sudo:session): session closed for user root
Dec 01 09:18:03 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:18:03 compute-0 ceph-mon[75031]: 6.e scrub starts
Dec 01 09:18:03 compute-0 ceph-mon[75031]: 6.e scrub ok
Dec 01 09:18:03 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:18:03 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:18:04 compute-0 ceph-mon[75031]: pgmap v183: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:05 compute-0 sudo[110086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjjskwokpuqhbcllqfkjcgsylfwfegxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580684.697381-422-128638606449594/AnsiballZ_systemd.py'
Dec 01 09:18:05 compute-0 sudo[110086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:18:05 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v184: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.9 scrub starts
Dec 01 09:18:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.9 scrub ok
Dec 01 09:18:05 compute-0 python3.9[110088]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:18:05 compute-0 sudo[110086]: pam_unix(sudo:session): session closed for user root
Dec 01 09:18:06 compute-0 sudo[110240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmutxtxxjeqkmpogyarjvbhaktmkgjzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580685.641203-422-78671070059035/AnsiballZ_systemd.py'
Dec 01 09:18:06 compute-0 sudo[110240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:18:06 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.8 deep-scrub starts
Dec 01 09:18:06 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.8 deep-scrub ok
Dec 01 09:18:06 compute-0 python3.9[110242]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:18:06 compute-0 sudo[110240]: pam_unix(sudo:session): session closed for user root
Dec 01 09:18:06 compute-0 ceph-mon[75031]: pgmap v184: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:06 compute-0 ceph-mon[75031]: 4.9 scrub starts
Dec 01 09:18:06 compute-0 ceph-mon[75031]: 4.9 scrub ok
Dec 01 09:18:06 compute-0 sshd-session[102077]: Connection closed by 192.168.122.30 port 34252
Dec 01 09:18:06 compute-0 sshd-session[102074]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:18:06 compute-0 systemd[1]: session-35.scope: Deactivated successfully.
Dec 01 09:18:06 compute-0 systemd[1]: session-35.scope: Consumed 1min 12.517s CPU time.
Dec 01 09:18:06 compute-0 systemd-logind[788]: Session 35 logged out. Waiting for processes to exit.
Dec 01 09:18:06 compute-0 systemd-logind[788]: Removed session 35.
Dec 01 09:18:07 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v185: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:07 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.17 scrub starts
Dec 01 09:18:07 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.17 scrub ok
Dec 01 09:18:07 compute-0 ceph-mon[75031]: 4.8 deep-scrub starts
Dec 01 09:18:07 compute-0 ceph-mon[75031]: 4.8 deep-scrub ok
Dec 01 09:18:08 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.14 scrub starts
Dec 01 09:18:08 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.14 scrub ok
Dec 01 09:18:08 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1a scrub starts
Dec 01 09:18:08 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1a scrub ok
Dec 01 09:18:08 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:18:08 compute-0 ceph-mon[75031]: pgmap v185: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:08 compute-0 ceph-mon[75031]: 6.17 scrub starts
Dec 01 09:18:08 compute-0 ceph-mon[75031]: 6.17 scrub ok
Dec 01 09:18:09 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v186: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:09 compute-0 ceph-mon[75031]: 4.14 scrub starts
Dec 01 09:18:09 compute-0 ceph-mon[75031]: 4.14 scrub ok
Dec 01 09:18:09 compute-0 ceph-mon[75031]: 7.1a scrub starts
Dec 01 09:18:09 compute-0 ceph-mon[75031]: 7.1a scrub ok
Dec 01 09:18:10 compute-0 ceph-mon[75031]: pgmap v186: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:11 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v187: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:11 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.12 scrub starts
Dec 01 09:18:11 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.12 scrub ok
Dec 01 09:18:11 compute-0 ceph-mon[75031]: 4.12 scrub starts
Dec 01 09:18:11 compute-0 ceph-mon[75031]: 4.12 scrub ok
Dec 01 09:18:12 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.10 scrub starts
Dec 01 09:18:12 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.10 scrub ok
Dec 01 09:18:12 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1 scrub starts
Dec 01 09:18:12 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1 scrub ok
Dec 01 09:18:12 compute-0 sshd-session[110269]: Accepted publickey for zuul from 192.168.122.30 port 55948 ssh2: ECDSA SHA256:qeYLzcMt7u+aIXWvxTim9ZQrhR80x0VMZgLc05vjQmw
Dec 01 09:18:12 compute-0 systemd-logind[788]: New session 36 of user zuul.
Dec 01 09:18:12 compute-0 systemd[1]: Started Session 36 of User zuul.
Dec 01 09:18:12 compute-0 sshd-session[110269]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:18:12 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:18:12
Dec 01 09:18:12 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:18:12 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:18:12 compute-0 ceph-mgr[75324]: [balancer INFO root] pools ['images', 'cephfs.cephfs.meta', 'vms', '.mgr', 'volumes', 'cephfs.cephfs.data', 'backups']
Dec 01 09:18:12 compute-0 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec 01 09:18:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:18:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:18:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:18:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:18:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:18:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:18:13 compute-0 ceph-mon[75031]: pgmap v187: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:13 compute-0 ceph-mon[75031]: 4.10 scrub starts
Dec 01 09:18:13 compute-0 ceph-mon[75031]: 4.10 scrub ok
Dec 01 09:18:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:18:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:18:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:18:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:18:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:18:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:18:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:18:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:18:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:18:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:18:13 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v188: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:13 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.2 scrub starts
Dec 01 09:18:13 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.2 scrub ok
Dec 01 09:18:13 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.1d scrub starts
Dec 01 09:18:13 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.1d scrub ok
Dec 01 09:18:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:18:13 compute-0 python3.9[110422]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:18:14 compute-0 ceph-mon[75031]: 7.1 scrub starts
Dec 01 09:18:14 compute-0 ceph-mon[75031]: 7.1 scrub ok
Dec 01 09:18:14 compute-0 ceph-mon[75031]: 6.2 scrub starts
Dec 01 09:18:14 compute-0 ceph-mon[75031]: 6.2 scrub ok
Dec 01 09:18:14 compute-0 sudo[110576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbluqjoxuxruvqoibdlgstgytdgqasia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580694.225-36-201904214562529/AnsiballZ_getent.py'
Dec 01 09:18:14 compute-0 sudo[110576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:18:14 compute-0 python3.9[110578]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Dec 01 09:18:14 compute-0 sudo[110576]: pam_unix(sudo:session): session closed for user root
Dec 01 09:18:15 compute-0 ceph-mon[75031]: pgmap v188: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:15 compute-0 ceph-mon[75031]: 3.1d scrub starts
Dec 01 09:18:15 compute-0 ceph-mon[75031]: 3.1d scrub ok
Dec 01 09:18:15 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v189: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:15 compute-0 sudo[110729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfammredjugzrmzxocvomkuxmpkxfhaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580695.17749-48-186333638471447/AnsiballZ_setup.py'
Dec 01 09:18:15 compute-0 sudo[110729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:18:15 compute-0 python3.9[110731]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 09:18:15 compute-0 sudo[110729]: pam_unix(sudo:session): session closed for user root
Dec 01 09:18:16 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.11 scrub starts
Dec 01 09:18:16 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.11 scrub ok
Dec 01 09:18:16 compute-0 sudo[110813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjuujdrdvbzgaierprjflysqwhvhhudu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580695.17749-48-186333638471447/AnsiballZ_dnf.py'
Dec 01 09:18:16 compute-0 sudo[110813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:18:16 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.5 scrub starts
Dec 01 09:18:16 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.5 scrub ok
Dec 01 09:18:16 compute-0 python3.9[110815]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 01 09:18:17 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v190: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:17 compute-0 ceph-mon[75031]: pgmap v189: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:17 compute-0 ceph-mon[75031]: 5.11 scrub starts
Dec 01 09:18:17 compute-0 ceph-mon[75031]: 5.11 scrub ok
Dec 01 09:18:17 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.1e scrub starts
Dec 01 09:18:17 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.1e scrub ok
Dec 01 09:18:17 compute-0 sudo[110813]: pam_unix(sudo:session): session closed for user root
Dec 01 09:18:18 compute-0 ceph-mon[75031]: 3.5 scrub starts
Dec 01 09:18:18 compute-0 ceph-mon[75031]: 3.5 scrub ok
Dec 01 09:18:18 compute-0 ceph-mon[75031]: pgmap v190: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:18 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.1b deep-scrub starts
Dec 01 09:18:18 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.1b deep-scrub ok
Dec 01 09:18:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:18:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:18:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:18:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:18:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:18:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:18:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:18:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:18:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:18:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:18:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:18:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:18:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec 01 09:18:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:18:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:18:18 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:18:18 compute-0 sudo[110966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvkxvvnmlwslqzswswmanxkekhtktooj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580698.2395344-62-60050101627007/AnsiballZ_dnf.py'
Dec 01 09:18:18 compute-0 sudo[110966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:18:18 compute-0 python3.9[110968]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:18:19 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v191: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:19 compute-0 ceph-mon[75031]: 3.1e scrub starts
Dec 01 09:18:19 compute-0 ceph-mon[75031]: 3.1e scrub ok
Dec 01 09:18:19 compute-0 ceph-mon[75031]: 2.1b deep-scrub starts
Dec 01 09:18:19 compute-0 ceph-mon[75031]: 2.1b deep-scrub ok
Dec 01 09:18:20 compute-0 sudo[110966]: pam_unix(sudo:session): session closed for user root
Dec 01 09:18:20 compute-0 ceph-mon[75031]: pgmap v191: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:20 compute-0 sudo[111119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfwveuzdvobcdxrzizdhtqdbjchagjoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580700.3131323-70-125402858960996/AnsiballZ_systemd.py'
Dec 01 09:18:20 compute-0 sudo[111119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:18:21 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v192: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:21 compute-0 python3.9[111121]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 01 09:18:21 compute-0 sudo[111119]: pam_unix(sudo:session): session closed for user root
Dec 01 09:18:22 compute-0 python3.9[111274]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:18:22 compute-0 ceph-mon[75031]: pgmap v192: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:22 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.e scrub starts
Dec 01 09:18:22 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.e scrub ok
Dec 01 09:18:22 compute-0 sudo[111424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbvpvaroomopsdjrrolhynytnezvxkoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580702.2146606-88-13017712281633/AnsiballZ_sefcontext.py'
Dec 01 09:18:22 compute-0 sudo[111424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:18:22 compute-0 python3.9[111426]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Dec 01 09:18:23 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v193: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:23 compute-0 sudo[111424]: pam_unix(sudo:session): session closed for user root
Dec 01 09:18:23 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:18:23 compute-0 python3.9[111576]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:18:24 compute-0 ceph-mon[75031]: 7.e scrub starts
Dec 01 09:18:24 compute-0 ceph-mon[75031]: 7.e scrub ok
Dec 01 09:18:24 compute-0 ceph-mon[75031]: pgmap v193: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:24 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1d scrub starts
Dec 01 09:18:24 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1d scrub ok
Dec 01 09:18:24 compute-0 sudo[111732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyftcrgcstqbjixurnnajlslwierlwbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580704.2980936-106-90250170208431/AnsiballZ_dnf.py'
Dec 01 09:18:24 compute-0 sudo[111732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:18:24 compute-0 python3.9[111734]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:18:25 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v194: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:25 compute-0 ceph-mon[75031]: 6.1d scrub starts
Dec 01 09:18:25 compute-0 ceph-mon[75031]: 6.1d scrub ok
Dec 01 09:18:26 compute-0 sudo[111732]: pam_unix(sudo:session): session closed for user root
Dec 01 09:18:26 compute-0 ceph-mon[75031]: pgmap v194: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:26 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1c scrub starts
Dec 01 09:18:26 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1c scrub ok
Dec 01 09:18:26 compute-0 sudo[111885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuhcqskhkpimqwulygvrljtkycxfbmxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580706.347556-114-127359145271860/AnsiballZ_command.py'
Dec 01 09:18:26 compute-0 sudo[111885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:18:26 compute-0 python3.9[111887]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:18:27 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v195: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:27 compute-0 ceph-mon[75031]: 6.1c scrub starts
Dec 01 09:18:27 compute-0 ceph-mon[75031]: 6.1c scrub ok
Dec 01 09:18:27 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.4 scrub starts
Dec 01 09:18:27 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.4 scrub ok
Dec 01 09:18:27 compute-0 sudo[111885]: pam_unix(sudo:session): session closed for user root
Dec 01 09:18:28 compute-0 ceph-mon[75031]: pgmap v195: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:28 compute-0 ceph-mon[75031]: 6.4 scrub starts
Dec 01 09:18:28 compute-0 ceph-mon[75031]: 6.4 scrub ok
Dec 01 09:18:28 compute-0 sudo[112172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvokkppntwcijppfhkabpthgduakvshl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580707.8239784-122-44031216894714/AnsiballZ_file.py'
Dec 01 09:18:28 compute-0 sudo[112172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:18:28 compute-0 python3.9[112174]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 01 09:18:28 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:18:28 compute-0 sudo[112172]: pam_unix(sudo:session): session closed for user root
Dec 01 09:18:29 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v196: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:29 compute-0 python3.9[112324]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:18:29 compute-0 sudo[112476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abbgwupcgfrpbnpuzemaqywtstgabcmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580709.3858442-138-65126886113594/AnsiballZ_dnf.py'
Dec 01 09:18:29 compute-0 sudo[112476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:18:29 compute-0 python3.9[112478]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:18:30 compute-0 ceph-mon[75031]: pgmap v196: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:31 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v197: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:31 compute-0 sudo[112476]: pam_unix(sudo:session): session closed for user root
Dec 01 09:18:32 compute-0 sudo[112629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jylpxxguelcbedudqgmlqqrvgvzamqim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580711.770879-147-234674351655182/AnsiballZ_dnf.py'
Dec 01 09:18:32 compute-0 sudo[112629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:18:32 compute-0 ceph-mon[75031]: pgmap v197: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:32 compute-0 python3.9[112631]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:18:33 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v198: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:33 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:18:33 compute-0 sudo[112629]: pam_unix(sudo:session): session closed for user root
Dec 01 09:18:34 compute-0 ceph-mon[75031]: pgmap v198: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:34 compute-0 sudo[112782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qybawxhjhymczyypizutwbodmrwaybln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580714.0349784-159-98080545693914/AnsiballZ_stat.py'
Dec 01 09:18:34 compute-0 sudo[112782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:18:34 compute-0 python3.9[112784]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:18:34 compute-0 sudo[112782]: pam_unix(sudo:session): session closed for user root
Dec 01 09:18:35 compute-0 sudo[112936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jszgkxabfbqqrhwxravyhrwsvgxikrta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580714.6623406-167-271233383063092/AnsiballZ_slurp.py'
Dec 01 09:18:35 compute-0 sudo[112936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:18:35 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v199: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:35 compute-0 python3.9[112938]: ansible-ansible.builtin.slurp Invoked with path=/var/lib/edpm-config/os-net-config.returncode src=/var/lib/edpm-config/os-net-config.returncode
Dec 01 09:18:35 compute-0 sudo[112936]: pam_unix(sudo:session): session closed for user root
Dec 01 09:18:36 compute-0 sshd-session[110272]: Connection closed by 192.168.122.30 port 55948
Dec 01 09:18:36 compute-0 sshd-session[110269]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:18:36 compute-0 systemd[1]: session-36.scope: Deactivated successfully.
Dec 01 09:18:36 compute-0 systemd[1]: session-36.scope: Consumed 17.930s CPU time.
Dec 01 09:18:36 compute-0 systemd-logind[788]: Session 36 logged out. Waiting for processes to exit.
Dec 01 09:18:36 compute-0 systemd-logind[788]: Removed session 36.
Dec 01 09:18:36 compute-0 ceph-mon[75031]: pgmap v199: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:37 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v200: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:38 compute-0 ceph-mon[75031]: pgmap v200: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:38 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:18:39 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v201: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:40 compute-0 ceph-mon[75031]: pgmap v201: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:41 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v202: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:41 compute-0 sshd-session[112963]: Accepted publickey for zuul from 192.168.122.30 port 42596 ssh2: ECDSA SHA256:qeYLzcMt7u+aIXWvxTim9ZQrhR80x0VMZgLc05vjQmw
Dec 01 09:18:41 compute-0 systemd-logind[788]: New session 37 of user zuul.
Dec 01 09:18:41 compute-0 systemd[1]: Started Session 37 of User zuul.
Dec 01 09:18:41 compute-0 sshd-session[112963]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:18:42 compute-0 python3.9[113116]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:18:42 compute-0 ceph-mon[75031]: pgmap v202: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:18:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:18:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:18:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:18:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:18:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:18:43 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v203: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:43 compute-0 python3.9[113270]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 09:18:43 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:18:44 compute-0 python3.9[113463]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:18:45 compute-0 sshd-session[112966]: Connection closed by 192.168.122.30 port 42596
Dec 01 09:18:45 compute-0 sshd-session[112963]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:18:45 compute-0 systemd[1]: session-37.scope: Deactivated successfully.
Dec 01 09:18:45 compute-0 systemd[1]: session-37.scope: Consumed 2.533s CPU time.
Dec 01 09:18:45 compute-0 systemd-logind[788]: Session 37 logged out. Waiting for processes to exit.
Dec 01 09:18:45 compute-0 ceph-mon[75031]: pgmap v203: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:45 compute-0 systemd-logind[788]: Removed session 37.
Dec 01 09:18:45 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v204: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:47 compute-0 ceph-mon[75031]: pgmap v204: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:47 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v205: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:48 compute-0 ceph-mon[75031]: pgmap v205: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:48 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:18:49 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v206: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:50 compute-0 ceph-mon[75031]: pgmap v206: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:50 compute-0 sshd-session[113490]: Accepted publickey for zuul from 192.168.122.30 port 58494 ssh2: ECDSA SHA256:qeYLzcMt7u+aIXWvxTim9ZQrhR80x0VMZgLc05vjQmw
Dec 01 09:18:50 compute-0 systemd-logind[788]: New session 38 of user zuul.
Dec 01 09:18:50 compute-0 systemd[1]: Started Session 38 of User zuul.
Dec 01 09:18:50 compute-0 sshd-session[113490]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:18:51 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v207: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:51 compute-0 python3.9[113643]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:18:52 compute-0 ceph-mon[75031]: pgmap v207: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:52 compute-0 python3.9[113797]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:18:53 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v208: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:53 compute-0 sudo[113951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wihatsrmgnwpedawywgzsrajcgrrwuzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580733.2134092-40-69553008863639/AnsiballZ_setup.py'
Dec 01 09:18:53 compute-0 sudo[113951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:18:53 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:18:53 compute-0 python3.9[113953]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 09:18:54 compute-0 sudo[113951]: pam_unix(sudo:session): session closed for user root
Dec 01 09:18:54 compute-0 ceph-mon[75031]: pgmap v208: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:54 compute-0 sudo[114035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljnjkhdsuwcanmldxunqwdbfbrgfliue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580733.2134092-40-69553008863639/AnsiballZ_dnf.py'
Dec 01 09:18:54 compute-0 sudo[114035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:18:54 compute-0 python3.9[114037]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:18:55 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v209: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:56 compute-0 sudo[114035]: pam_unix(sudo:session): session closed for user root
Dec 01 09:18:56 compute-0 ceph-mon[75031]: pgmap v209: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:56 compute-0 sudo[114188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtnhbsfvmgfuprngtyvlloxjlyrfajwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580736.2478144-52-244151037834303/AnsiballZ_setup.py'
Dec 01 09:18:56 compute-0 sudo[114188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:18:56 compute-0 python3.9[114190]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 09:18:57 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v210: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:57 compute-0 sudo[114188]: pam_unix(sudo:session): session closed for user root
Dec 01 09:18:58 compute-0 sudo[114383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfbrxdtgejvgecgqgjkwbpfteufbdjbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580737.4387627-63-241850619245714/AnsiballZ_file.py'
Dec 01 09:18:58 compute-0 sudo[114383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:18:58 compute-0 python3.9[114385]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:18:58 compute-0 sudo[114383]: pam_unix(sudo:session): session closed for user root
Dec 01 09:18:58 compute-0 ceph-mon[75031]: pgmap v210: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:58 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:18:58 compute-0 sudo[114535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fajgtpoktrlcxwxgbuzinmwbzxncgulh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580738.37819-71-250224185910568/AnsiballZ_command.py'
Dec 01 09:18:58 compute-0 sudo[114535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:18:58 compute-0 python3.9[114537]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:18:59 compute-0 sudo[114535]: pam_unix(sudo:session): session closed for user root
Dec 01 09:18:59 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v211: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:18:59 compute-0 sudo[114700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gllvzhhhlvkldfddwygwbiuatrwaawnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580739.1985135-79-120398439071479/AnsiballZ_stat.py'
Dec 01 09:18:59 compute-0 sudo[114700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:18:59 compute-0 python3.9[114702]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:18:59 compute-0 sudo[114700]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:00 compute-0 sudo[114778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-loqibtolqssbmqlbkysdyfhclxhhyzej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580739.1985135-79-120398439071479/AnsiballZ_file.py'
Dec 01 09:19:00 compute-0 sudo[114778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:00 compute-0 python3.9[114780]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:19:00 compute-0 sudo[114778]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:00 compute-0 ceph-mon[75031]: pgmap v211: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:00 compute-0 sudo[114930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pimtkgokhpbyhxxtacomeytxeyzwubeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580740.489718-91-234950474991165/AnsiballZ_stat.py'
Dec 01 09:19:00 compute-0 sudo[114930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:00 compute-0 python3.9[114932]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:19:00 compute-0 sudo[114930]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:01 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v212: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:01 compute-0 sudo[115008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmqunacyzjqyimcjrfahtefwgbqqvmsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580740.489718-91-234950474991165/AnsiballZ_file.py'
Dec 01 09:19:01 compute-0 sudo[115008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:01 compute-0 python3.9[115010]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:19:01 compute-0 sudo[115008]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:02 compute-0 sudo[115160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esviraglzpkylzcmarpmblbvtkccaahl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580741.6112287-104-47078772729497/AnsiballZ_ini_file.py'
Dec 01 09:19:02 compute-0 sudo[115160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:02 compute-0 python3.9[115162]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:19:02 compute-0 sudo[115160]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:02 compute-0 ceph-mon[75031]: pgmap v212: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:02 compute-0 sudo[115312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbbdfiohescimpdbtelsmmtixxywnjlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580742.3927486-104-91562218362669/AnsiballZ_ini_file.py'
Dec 01 09:19:02 compute-0 sudo[115312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:02 compute-0 python3.9[115314]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:19:02 compute-0 sudo[115312]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:03 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v213: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:03 compute-0 sudo[115485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swujrmomvuqcpewcvrczxcfuijsjzjsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580743.0322056-104-85561976511269/AnsiballZ_ini_file.py'
Dec 01 09:19:03 compute-0 sudo[115485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:03 compute-0 sudo[115444]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:19:03 compute-0 sudo[115444]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:19:03 compute-0 sudo[115444]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:03 compute-0 sudo[115492]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:19:03 compute-0 sudo[115492]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:19:03 compute-0 sudo[115492]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:03 compute-0 sudo[115517]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:19:03 compute-0 sudo[115517]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:19:03 compute-0 sudo[115517]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:03 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:19:03 compute-0 python3.9[115489]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:19:03 compute-0 sudo[115542]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Dec 01 09:19:03 compute-0 sudo[115542]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:19:03 compute-0 sudo[115485]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:03 compute-0 sudo[115738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhnkcjzdspihqlitqygstjifncwxdvqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580743.6822937-104-122185226619760/AnsiballZ_ini_file.py'
Dec 01 09:19:03 compute-0 sudo[115738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:04 compute-0 sudo[115542]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:04 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:19:04 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:19:04 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec 01 09:19:04 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:19:04 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec 01 09:19:04 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:19:04 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 0fcfb794-f73b-477c-a5ce-19a94a36b49b does not exist
Dec 01 09:19:04 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 4021f264-8bed-4771-a488-a27a1e1c94a5 does not exist
Dec 01 09:19:04 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 6baee5d1-4cc6-43fe-a6f2-2dd304f92091 does not exist
Dec 01 09:19:04 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec 01 09:19:04 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:19:04 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec 01 09:19:04 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:19:04 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:19:04 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:19:04 compute-0 sudo[115750]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:19:04 compute-0 sudo[115750]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:19:04 compute-0 sudo[115750]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:04 compute-0 python3.9[115747]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:19:04 compute-0 sudo[115775]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:19:04 compute-0 sudo[115775]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:19:04 compute-0 sudo[115775]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:04 compute-0 sudo[115738]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:04 compute-0 sudo[115800]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:19:04 compute-0 sudo[115800]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:19:04 compute-0 sudo[115800]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:04 compute-0 sudo[115849]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --yes --no-systemd
Dec 01 09:19:04 compute-0 sudo[115849]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:19:04 compute-0 ceph-mon[75031]: pgmap v213: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:04 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:19:04 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:19:04 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:19:04 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:19:04 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:19:04 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:19:04 compute-0 podman[115990]: 2025-12-01 09:19:04.618244384 +0000 UTC m=+0.043629910 container create e2d561b52a01002b55b5632462e2c7b566374caa0b339eb31dc77d22798d4a84 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_franklin, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Dec 01 09:19:04 compute-0 systemd[76658]: Created slice User Background Tasks Slice.
Dec 01 09:19:04 compute-0 systemd[76658]: Starting Cleanup of User's Temporary Files and Directories...
Dec 01 09:19:04 compute-0 systemd[1]: Started libpod-conmon-e2d561b52a01002b55b5632462e2c7b566374caa0b339eb31dc77d22798d4a84.scope.
Dec 01 09:19:04 compute-0 systemd[76658]: Finished Cleanup of User's Temporary Files and Directories.
Dec 01 09:19:04 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:19:04 compute-0 sudo[116061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbuzwniouzvfnffjmahxlafmrsissiqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580744.425397-135-107940821707983/AnsiballZ_dnf.py'
Dec 01 09:19:04 compute-0 sudo[116061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:04 compute-0 podman[115990]: 2025-12-01 09:19:04.599886626 +0000 UTC m=+0.025272162 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:19:04 compute-0 podman[115990]: 2025-12-01 09:19:04.706882211 +0000 UTC m=+0.132267757 container init e2d561b52a01002b55b5632462e2c7b566374caa0b339eb31dc77d22798d4a84 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_franklin, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:19:04 compute-0 podman[115990]: 2025-12-01 09:19:04.717402799 +0000 UTC m=+0.142788315 container start e2d561b52a01002b55b5632462e2c7b566374caa0b339eb31dc77d22798d4a84 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_franklin, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3)
Dec 01 09:19:04 compute-0 podman[115990]: 2025-12-01 09:19:04.721510549 +0000 UTC m=+0.146896095 container attach e2d561b52a01002b55b5632462e2c7b566374caa0b339eb31dc77d22798d4a84 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_franklin, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507)
Dec 01 09:19:04 compute-0 systemd[1]: libpod-e2d561b52a01002b55b5632462e2c7b566374caa0b339eb31dc77d22798d4a84.scope: Deactivated successfully.
Dec 01 09:19:04 compute-0 nifty_franklin[116053]: 167 167
Dec 01 09:19:04 compute-0 conmon[116053]: conmon e2d561b52a01002b55b5 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-e2d561b52a01002b55b5632462e2c7b566374caa0b339eb31dc77d22798d4a84.scope/container/memory.events
Dec 01 09:19:04 compute-0 podman[115990]: 2025-12-01 09:19:04.72803239 +0000 UTC m=+0.153417906 container died e2d561b52a01002b55b5632462e2c7b566374caa0b339eb31dc77d22798d4a84 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_franklin, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec 01 09:19:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-df462a973db5f2102e7c8467d4330e423558579b482123e47f23774a42bc3cc2-merged.mount: Deactivated successfully.
Dec 01 09:19:04 compute-0 podman[115990]: 2025-12-01 09:19:04.769633789 +0000 UTC m=+0.195019315 container remove e2d561b52a01002b55b5632462e2c7b566374caa0b339eb31dc77d22798d4a84 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_franklin, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:19:04 compute-0 systemd[1]: libpod-conmon-e2d561b52a01002b55b5632462e2c7b566374caa0b339eb31dc77d22798d4a84.scope: Deactivated successfully.
Dec 01 09:19:04 compute-0 python3.9[116063]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:19:04 compute-0 podman[116085]: 2025-12-01 09:19:04.950108837 +0000 UTC m=+0.054129637 container create 3802924bfd00f42b8827ebb6046a991285ddbb5559593f4fa529b1c0be9b96f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_wescoff, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:19:04 compute-0 systemd[1]: Started libpod-conmon-3802924bfd00f42b8827ebb6046a991285ddbb5559593f4fa529b1c0be9b96f1.scope.
Dec 01 09:19:05 compute-0 podman[116085]: 2025-12-01 09:19:04.923826717 +0000 UTC m=+0.027847607 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:19:05 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:19:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/353aefb266033841909c2dea13bb09560e1ccfdda1c6fcbe9316a056e8743818/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:19:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/353aefb266033841909c2dea13bb09560e1ccfdda1c6fcbe9316a056e8743818/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:19:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/353aefb266033841909c2dea13bb09560e1ccfdda1c6fcbe9316a056e8743818/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:19:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/353aefb266033841909c2dea13bb09560e1ccfdda1c6fcbe9316a056e8743818/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:19:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/353aefb266033841909c2dea13bb09560e1ccfdda1c6fcbe9316a056e8743818/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:19:05 compute-0 podman[116085]: 2025-12-01 09:19:05.038714193 +0000 UTC m=+0.142735013 container init 3802924bfd00f42b8827ebb6046a991285ddbb5559593f4fa529b1c0be9b96f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_wescoff, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 01 09:19:05 compute-0 podman[116085]: 2025-12-01 09:19:05.048980484 +0000 UTC m=+0.153001284 container start 3802924bfd00f42b8827ebb6046a991285ddbb5559593f4fa529b1c0be9b96f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_wescoff, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef)
Dec 01 09:19:05 compute-0 podman[116085]: 2025-12-01 09:19:05.05259094 +0000 UTC m=+0.156611780 container attach 3802924bfd00f42b8827ebb6046a991285ddbb5559593f4fa529b1c0be9b96f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_wescoff, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:19:05 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v214: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:06 compute-0 hungry_wescoff[116103]: --> passed data devices: 0 physical, 3 LVM
Dec 01 09:19:06 compute-0 hungry_wescoff[116103]: --> relative data size: 1.0
Dec 01 09:19:06 compute-0 hungry_wescoff[116103]: --> All data devices are unavailable
Dec 01 09:19:06 compute-0 systemd[1]: libpod-3802924bfd00f42b8827ebb6046a991285ddbb5559593f4fa529b1c0be9b96f1.scope: Deactivated successfully.
Dec 01 09:19:06 compute-0 systemd[1]: libpod-3802924bfd00f42b8827ebb6046a991285ddbb5559593f4fa529b1c0be9b96f1.scope: Consumed 1.113s CPU time.
Dec 01 09:19:06 compute-0 podman[116132]: 2025-12-01 09:19:06.258852163 +0000 UTC m=+0.033035809 container died 3802924bfd00f42b8827ebb6046a991285ddbb5559593f4fa529b1c0be9b96f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_wescoff, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:19:06 compute-0 systemd[1]: var-lib-containers-storage-overlay-353aefb266033841909c2dea13bb09560e1ccfdda1c6fcbe9316a056e8743818-merged.mount: Deactivated successfully.
Dec 01 09:19:06 compute-0 podman[116132]: 2025-12-01 09:19:06.329704669 +0000 UTC m=+0.103888315 container remove 3802924bfd00f42b8827ebb6046a991285ddbb5559593f4fa529b1c0be9b96f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_wescoff, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:19:06 compute-0 systemd[1]: libpod-conmon-3802924bfd00f42b8827ebb6046a991285ddbb5559593f4fa529b1c0be9b96f1.scope: Deactivated successfully.
Dec 01 09:19:06 compute-0 sudo[115849]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:06 compute-0 ceph-mon[75031]: pgmap v214: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:06 compute-0 sudo[116147]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:19:06 compute-0 sudo[116147]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:19:06 compute-0 sudo[116147]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:06 compute-0 sudo[116061]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:06 compute-0 sudo[116172]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:19:06 compute-0 sudo[116172]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:19:06 compute-0 sudo[116172]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:06 compute-0 sudo[116209]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:19:06 compute-0 sudo[116209]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:19:06 compute-0 sudo[116209]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:06 compute-0 sudo[116246]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- lvm list --format json
Dec 01 09:19:06 compute-0 sudo[116246]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:19:06 compute-0 podman[116334]: 2025-12-01 09:19:06.953249979 +0000 UTC m=+0.045133813 container create 5200925d6bbb8f3787d155220a7af7e386757ce5007761589e6aa257e1934401 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_khorana, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:19:06 compute-0 systemd[1]: Started libpod-conmon-5200925d6bbb8f3787d155220a7af7e386757ce5007761589e6aa257e1934401.scope.
Dec 01 09:19:07 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:19:07 compute-0 podman[116334]: 2025-12-01 09:19:06.935487769 +0000 UTC m=+0.027371643 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:19:07 compute-0 podman[116334]: 2025-12-01 09:19:07.045364618 +0000 UTC m=+0.137248492 container init 5200925d6bbb8f3787d155220a7af7e386757ce5007761589e6aa257e1934401 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_khorana, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:19:07 compute-0 podman[116334]: 2025-12-01 09:19:07.053105475 +0000 UTC m=+0.144989309 container start 5200925d6bbb8f3787d155220a7af7e386757ce5007761589e6aa257e1934401 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_khorana, ceph=True, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:19:07 compute-0 podman[116334]: 2025-12-01 09:19:07.057417821 +0000 UTC m=+0.149301665 container attach 5200925d6bbb8f3787d155220a7af7e386757ce5007761589e6aa257e1934401 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_khorana, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec 01 09:19:07 compute-0 condescending_khorana[116401]: 167 167
Dec 01 09:19:07 compute-0 systemd[1]: libpod-5200925d6bbb8f3787d155220a7af7e386757ce5007761589e6aa257e1934401.scope: Deactivated successfully.
Dec 01 09:19:07 compute-0 podman[116334]: 2025-12-01 09:19:07.062202141 +0000 UTC m=+0.154085985 container died 5200925d6bbb8f3787d155220a7af7e386757ce5007761589e6aa257e1934401 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_khorana, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Dec 01 09:19:07 compute-0 systemd[1]: var-lib-containers-storage-overlay-1184d164b0653522a65e11bc4631e91d7e93b70056f5e382f92fb46885ec9951-merged.mount: Deactivated successfully.
Dec 01 09:19:07 compute-0 podman[116334]: 2025-12-01 09:19:07.102043059 +0000 UTC m=+0.193926913 container remove 5200925d6bbb8f3787d155220a7af7e386757ce5007761589e6aa257e1934401 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_khorana, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:19:07 compute-0 systemd[1]: libpod-conmon-5200925d6bbb8f3787d155220a7af7e386757ce5007761589e6aa257e1934401.scope: Deactivated successfully.
Dec 01 09:19:07 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v215: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:07 compute-0 sudo[116471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnotlcmtwgybsmxjgoigoavqfqqwgjwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580746.8773866-146-189608097577949/AnsiballZ_setup.py'
Dec 01 09:19:07 compute-0 sudo[116471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:07 compute-0 podman[116479]: 2025-12-01 09:19:07.287017307 +0000 UTC m=+0.051635304 container create e744af5bd210f1da44b63c872213966cd5662df772ecbeac0940156f60049039 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_antonelli, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef)
Dec 01 09:19:07 compute-0 systemd[1]: Started libpod-conmon-e744af5bd210f1da44b63c872213966cd5662df772ecbeac0940156f60049039.scope.
Dec 01 09:19:07 compute-0 podman[116479]: 2025-12-01 09:19:07.264047954 +0000 UTC m=+0.028665951 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:19:07 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:19:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4742dfe1e3643d4b7d4da31c965b962bb31a4e794f6c08dcc49db142316fce7/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:19:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4742dfe1e3643d4b7d4da31c965b962bb31a4e794f6c08dcc49db142316fce7/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:19:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4742dfe1e3643d4b7d4da31c965b962bb31a4e794f6c08dcc49db142316fce7/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:19:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4742dfe1e3643d4b7d4da31c965b962bb31a4e794f6c08dcc49db142316fce7/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:19:07 compute-0 podman[116479]: 2025-12-01 09:19:07.392520879 +0000 UTC m=+0.157138866 container init e744af5bd210f1da44b63c872213966cd5662df772ecbeac0940156f60049039 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_antonelli, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.vendor=CentOS)
Dec 01 09:19:07 compute-0 podman[116479]: 2025-12-01 09:19:07.400688338 +0000 UTC m=+0.165306305 container start e744af5bd210f1da44b63c872213966cd5662df772ecbeac0940156f60049039 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_antonelli, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:19:07 compute-0 podman[116479]: 2025-12-01 09:19:07.403961544 +0000 UTC m=+0.168579511 container attach e744af5bd210f1da44b63c872213966cd5662df772ecbeac0940156f60049039 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_antonelli, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507)
Dec 01 09:19:07 compute-0 python3.9[116473]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:19:07 compute-0 sudo[116471]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:08 compute-0 sudo[116651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyvgrynpxqbbmtosaxzzbrqxqpgfpxju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580747.713429-154-37724299706600/AnsiballZ_stat.py'
Dec 01 09:19:08 compute-0 sudo[116651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:08 compute-0 python3.9[116653]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:19:08 compute-0 sudo[116651]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]: {
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:     "0": [
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:         {
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             "devices": [
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "/dev/loop3"
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             ],
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             "lv_name": "ceph_lv0",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             "lv_size": "21470642176",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             "name": "ceph_lv0",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             "tags": {
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.cluster_name": "ceph",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.crush_device_class": "",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.encrypted": "0",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.osd_id": "0",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.type": "block",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.vdo": "0"
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             },
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             "type": "block",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             "vg_name": "ceph_vg0"
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:         }
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:     ],
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:     "1": [
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:         {
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             "devices": [
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "/dev/loop4"
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             ],
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             "lv_name": "ceph_lv1",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             "lv_size": "21470642176",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             "name": "ceph_lv1",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             "tags": {
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.cluster_name": "ceph",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.crush_device_class": "",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.encrypted": "0",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.osd_id": "1",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.type": "block",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.vdo": "0"
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             },
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             "type": "block",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             "vg_name": "ceph_vg1"
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:         }
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:     ],
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:     "2": [
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:         {
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             "devices": [
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "/dev/loop5"
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             ],
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             "lv_name": "ceph_lv2",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             "lv_size": "21470642176",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             "name": "ceph_lv2",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             "tags": {
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.cluster_name": "ceph",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.crush_device_class": "",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.encrypted": "0",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.osd_id": "2",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.type": "block",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:                 "ceph.vdo": "0"
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             },
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             "type": "block",
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:             "vg_name": "ceph_vg2"
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:         }
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]:     ]
Dec 01 09:19:08 compute-0 affectionate_antonelli[116495]: }
Dec 01 09:19:08 compute-0 systemd[1]: libpod-e744af5bd210f1da44b63c872213966cd5662df772ecbeac0940156f60049039.scope: Deactivated successfully.
Dec 01 09:19:08 compute-0 podman[116479]: 2025-12-01 09:19:08.260433908 +0000 UTC m=+1.025051885 container died e744af5bd210f1da44b63c872213966cd5662df772ecbeac0940156f60049039 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_antonelli, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:19:08 compute-0 systemd[1]: var-lib-containers-storage-overlay-b4742dfe1e3643d4b7d4da31c965b962bb31a4e794f6c08dcc49db142316fce7-merged.mount: Deactivated successfully.
Dec 01 09:19:08 compute-0 podman[116479]: 2025-12-01 09:19:08.334599851 +0000 UTC m=+1.099217818 container remove e744af5bd210f1da44b63c872213966cd5662df772ecbeac0940156f60049039 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_antonelli, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:19:08 compute-0 systemd[1]: libpod-conmon-e744af5bd210f1da44b63c872213966cd5662df772ecbeac0940156f60049039.scope: Deactivated successfully.
Dec 01 09:19:08 compute-0 sudo[116246]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:08 compute-0 sudo[116693]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:19:08 compute-0 sudo[116693]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:19:08 compute-0 sudo[116693]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:08 compute-0 ceph-mon[75031]: pgmap v215: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:08 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:19:08 compute-0 sudo[116746]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:19:08 compute-0 sudo[116746]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:19:08 compute-0 sudo[116746]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:08 compute-0 sudo[116795]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:19:08 compute-0 sudo[116795]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:19:08 compute-0 sudo[116795]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:08 compute-0 sudo[116842]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- raw list --format json
Dec 01 09:19:08 compute-0 sudo[116842]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:19:08 compute-0 sudo[116917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wirimcgtpqkxyqxmbxubfgbkkpkdopdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580748.4255219-163-126011203682668/AnsiballZ_stat.py'
Dec 01 09:19:08 compute-0 sudo[116917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:08 compute-0 python3.9[116919]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:19:08 compute-0 sudo[116917]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:08 compute-0 podman[116959]: 2025-12-01 09:19:08.984764281 +0000 UTC m=+0.038287863 container create 4fb5f25536bad34854191e5076dcba7aab4b0fd2b65436d55441caa4ad3a1592 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_poincare, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:19:09 compute-0 systemd[1]: Started libpod-conmon-4fb5f25536bad34854191e5076dcba7aab4b0fd2b65436d55441caa4ad3a1592.scope.
Dec 01 09:19:09 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:19:09 compute-0 podman[116959]: 2025-12-01 09:19:09.059722367 +0000 UTC m=+0.113246019 container init 4fb5f25536bad34854191e5076dcba7aab4b0fd2b65436d55441caa4ad3a1592 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_poincare, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:19:09 compute-0 podman[116959]: 2025-12-01 09:19:08.967179636 +0000 UTC m=+0.020703238 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:19:09 compute-0 podman[116959]: 2025-12-01 09:19:09.07108005 +0000 UTC m=+0.124603632 container start 4fb5f25536bad34854191e5076dcba7aab4b0fd2b65436d55441caa4ad3a1592 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_poincare, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Dec 01 09:19:09 compute-0 podman[116959]: 2025-12-01 09:19:09.076724835 +0000 UTC m=+0.130248437 container attach 4fb5f25536bad34854191e5076dcba7aab4b0fd2b65436d55441caa4ad3a1592 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_poincare, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:19:09 compute-0 silly_poincare[116998]: 167 167
Dec 01 09:19:09 compute-0 systemd[1]: libpod-4fb5f25536bad34854191e5076dcba7aab4b0fd2b65436d55441caa4ad3a1592.scope: Deactivated successfully.
Dec 01 09:19:09 compute-0 podman[116959]: 2025-12-01 09:19:09.078418795 +0000 UTC m=+0.131942417 container died 4fb5f25536bad34854191e5076dcba7aab4b0fd2b65436d55441caa4ad3a1592 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_poincare, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec 01 09:19:09 compute-0 systemd[1]: var-lib-containers-storage-overlay-45481367aa360ee87f702722db0e483f1bbf8bee9dc86bed4c14d5d19b4af8b6-merged.mount: Deactivated successfully.
Dec 01 09:19:09 compute-0 podman[116959]: 2025-12-01 09:19:09.12023686 +0000 UTC m=+0.173760442 container remove 4fb5f25536bad34854191e5076dcba7aab4b0fd2b65436d55441caa4ad3a1592 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_poincare, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS)
Dec 01 09:19:09 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v216: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:09 compute-0 systemd[1]: libpod-conmon-4fb5f25536bad34854191e5076dcba7aab4b0fd2b65436d55441caa4ad3a1592.scope: Deactivated successfully.
Dec 01 09:19:09 compute-0 podman[117047]: 2025-12-01 09:19:09.286349627 +0000 UTC m=+0.048613725 container create 29b739b0b3d1c4c179519b277c77bb71a4f1ff418da6d4658f6bae39c3de3b1e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_mcclintock, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:19:09 compute-0 systemd[1]: Started libpod-conmon-29b739b0b3d1c4c179519b277c77bb71a4f1ff418da6d4658f6bae39c3de3b1e.scope.
Dec 01 09:19:09 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:19:09 compute-0 podman[117047]: 2025-12-01 09:19:09.26769569 +0000 UTC m=+0.029959818 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:19:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2d34f8e41fa0af715a9e6fc58726bed2dd599fe8dd93505ce90d5240b2ea45b/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:19:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2d34f8e41fa0af715a9e6fc58726bed2dd599fe8dd93505ce90d5240b2ea45b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:19:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2d34f8e41fa0af715a9e6fc58726bed2dd599fe8dd93505ce90d5240b2ea45b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:19:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2d34f8e41fa0af715a9e6fc58726bed2dd599fe8dd93505ce90d5240b2ea45b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:19:09 compute-0 podman[117047]: 2025-12-01 09:19:09.380767383 +0000 UTC m=+0.143031511 container init 29b739b0b3d1c4c179519b277c77bb71a4f1ff418da6d4658f6bae39c3de3b1e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_mcclintock, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.license=GPLv2, io.buildah.version=1.39.3)
Dec 01 09:19:09 compute-0 podman[117047]: 2025-12-01 09:19:09.389778637 +0000 UTC m=+0.152042735 container start 29b739b0b3d1c4c179519b277c77bb71a4f1ff418da6d4658f6bae39c3de3b1e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_mcclintock, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 01 09:19:09 compute-0 podman[117047]: 2025-12-01 09:19:09.393466005 +0000 UTC m=+0.155730113 container attach 29b739b0b3d1c4c179519b277c77bb71a4f1ff418da6d4658f6bae39c3de3b1e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_mcclintock, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:19:09 compute-0 sudo[117168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnuigykixbwjtaculmhizctcfxceqckd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580749.2050097-173-171775183283199/AnsiballZ_command.py'
Dec 01 09:19:09 compute-0 sudo[117168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:09 compute-0 python3.9[117170]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:19:09 compute-0 sudo[117168]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:10 compute-0 ceph-mon[75031]: pgmap v216: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:10 compute-0 busy_mcclintock[117111]: {
Dec 01 09:19:10 compute-0 busy_mcclintock[117111]:     "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec 01 09:19:10 compute-0 busy_mcclintock[117111]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:19:10 compute-0 busy_mcclintock[117111]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 01 09:19:10 compute-0 busy_mcclintock[117111]:         "osd_id": 0,
Dec 01 09:19:10 compute-0 busy_mcclintock[117111]:         "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:19:10 compute-0 busy_mcclintock[117111]:         "type": "bluestore"
Dec 01 09:19:10 compute-0 busy_mcclintock[117111]:     },
Dec 01 09:19:10 compute-0 busy_mcclintock[117111]:     "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec 01 09:19:10 compute-0 busy_mcclintock[117111]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:19:10 compute-0 busy_mcclintock[117111]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 01 09:19:10 compute-0 busy_mcclintock[117111]:         "osd_id": 1,
Dec 01 09:19:10 compute-0 busy_mcclintock[117111]:         "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:19:10 compute-0 busy_mcclintock[117111]:         "type": "bluestore"
Dec 01 09:19:10 compute-0 busy_mcclintock[117111]:     },
Dec 01 09:19:10 compute-0 busy_mcclintock[117111]:     "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec 01 09:19:10 compute-0 busy_mcclintock[117111]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:19:10 compute-0 busy_mcclintock[117111]:         "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec 01 09:19:10 compute-0 busy_mcclintock[117111]:         "osd_id": 2,
Dec 01 09:19:10 compute-0 busy_mcclintock[117111]:         "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:19:10 compute-0 busy_mcclintock[117111]:         "type": "bluestore"
Dec 01 09:19:10 compute-0 busy_mcclintock[117111]:     }
Dec 01 09:19:10 compute-0 busy_mcclintock[117111]: }
Dec 01 09:19:10 compute-0 sudo[117349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-megqupdkqeolbflokzjphxkobmqyptxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580750.0253549-183-21421339103344/AnsiballZ_service_facts.py'
Dec 01 09:19:10 compute-0 sudo[117349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:10 compute-0 systemd[1]: libpod-29b739b0b3d1c4c179519b277c77bb71a4f1ff418da6d4658f6bae39c3de3b1e.scope: Deactivated successfully.
Dec 01 09:19:10 compute-0 systemd[1]: libpod-29b739b0b3d1c4c179519b277c77bb71a4f1ff418da6d4658f6bae39c3de3b1e.scope: Consumed 1.117s CPU time.
Dec 01 09:19:10 compute-0 podman[117047]: 2025-12-01 09:19:10.507704022 +0000 UTC m=+1.269968120 container died 29b739b0b3d1c4c179519b277c77bb71a4f1ff418da6d4658f6bae39c3de3b1e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_mcclintock, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef)
Dec 01 09:19:10 compute-0 systemd[1]: var-lib-containers-storage-overlay-b2d34f8e41fa0af715a9e6fc58726bed2dd599fe8dd93505ce90d5240b2ea45b-merged.mount: Deactivated successfully.
Dec 01 09:19:10 compute-0 podman[117047]: 2025-12-01 09:19:10.590458787 +0000 UTC m=+1.352722875 container remove 29b739b0b3d1c4c179519b277c77bb71a4f1ff418da6d4658f6bae39c3de3b1e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_mcclintock, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:19:10 compute-0 systemd[1]: libpod-conmon-29b739b0b3d1c4c179519b277c77bb71a4f1ff418da6d4658f6bae39c3de3b1e.scope: Deactivated successfully.
Dec 01 09:19:10 compute-0 sudo[116842]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:10 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:19:10 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:19:10 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:19:10 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:19:10 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev e46bacb6-851c-489a-8cc1-7f07d16ef9ea does not exist
Dec 01 09:19:10 compute-0 python3.9[117351]: ansible-service_facts Invoked
Dec 01 09:19:10 compute-0 sudo[117364]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:19:10 compute-0 sudo[117364]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:19:10 compute-0 sudo[117364]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:10 compute-0 network[117418]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 01 09:19:10 compute-0 network[117424]: 'network-scripts' will be removed from distribution in near future.
Dec 01 09:19:10 compute-0 network[117428]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 01 09:19:10 compute-0 sudo[117395]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:19:10 compute-0 sudo[117395]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:19:10 compute-0 sudo[117395]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:11 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v217: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:11 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:19:11 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:19:12 compute-0 ceph-mon[75031]: pgmap v217: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:12 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:19:12
Dec 01 09:19:12 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:19:12 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:19:12 compute-0 ceph-mgr[75324]: [balancer INFO root] pools ['backups', 'cephfs.cephfs.meta', 'volumes', 'images', 'cephfs.cephfs.data', '.mgr', 'vms']
Dec 01 09:19:12 compute-0 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec 01 09:19:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:19:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:19:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:19:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:19:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:19:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:19:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:19:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:19:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:19:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:19:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:19:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:19:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:19:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:19:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:19:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:19:13 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v218: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:19:13 compute-0 sudo[117349]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:14 compute-0 ceph-mon[75031]: pgmap v218: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:14 compute-0 sudo[117715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyicocaefmcvropnjqmeolludidjtcnc ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1764580754.5495243-198-136527967087250/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1764580754.5495243-198-136527967087250/args'
Dec 01 09:19:14 compute-0 sudo[117715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:15 compute-0 sudo[117715]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:15 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v219: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:15 compute-0 sudo[117882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnjkzvjcueswlglufxwtpwbsnfwdwuth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580755.2478395-209-51559936422691/AnsiballZ_dnf.py'
Dec 01 09:19:15 compute-0 sudo[117882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:15 compute-0 python3.9[117884]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:19:16 compute-0 ceph-mon[75031]: pgmap v219: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:17 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v220: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:17 compute-0 sudo[117882]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:18 compute-0 sudo[118035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxbtqfwtorgokligpbnikkdgsfzvschm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580757.6035123-222-127974105585884/AnsiballZ_package_facts.py'
Dec 01 09:19:18 compute-0 sudo[118035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:19:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:19:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:19:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:19:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:19:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:19:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:19:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:19:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:19:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:19:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:19:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:19:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec 01 09:19:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:19:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:19:18 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:19:18 compute-0 python3.9[118037]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Dec 01 09:19:18 compute-0 sudo[118035]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:18 compute-0 ceph-mon[75031]: pgmap v220: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:19 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v221: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:19 compute-0 sudo[118187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtgccztshtvtndgyrirrhflwgrekjmlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580759.182206-232-160594747310554/AnsiballZ_stat.py'
Dec 01 09:19:19 compute-0 sudo[118187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:19 compute-0 python3.9[118189]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:19:19 compute-0 sudo[118187]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:20 compute-0 sudo[118265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvgmeyznkfiokvytjnxcoiiryekfvtcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580759.182206-232-160594747310554/AnsiballZ_file.py'
Dec 01 09:19:20 compute-0 sudo[118265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:20 compute-0 python3.9[118267]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:19:20 compute-0 sudo[118265]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:20 compute-0 ceph-mon[75031]: pgmap v221: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:20 compute-0 sudo[118417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqrvmqrxtyknbyiivbcwqpfrxdhrgiaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580760.4682667-244-106007599537969/AnsiballZ_stat.py'
Dec 01 09:19:20 compute-0 sudo[118417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:21 compute-0 python3.9[118419]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:19:21 compute-0 sudo[118417]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:21 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v222: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:21 compute-0 sudo[118495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dexujhtdykqcecrzzopqssweakhgbvrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580760.4682667-244-106007599537969/AnsiballZ_file.py'
Dec 01 09:19:21 compute-0 sudo[118495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:21 compute-0 python3.9[118497]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/chronyd _original_basename=chronyd.sysconfig.j2 recurse=False state=file path=/etc/sysconfig/chronyd force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:19:21 compute-0 sudo[118495]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:22 compute-0 sudo[118647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjsjuyyxzkuwfxwtitkmomdswertgbyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580762.0932329-262-167972885842861/AnsiballZ_lineinfile.py'
Dec 01 09:19:22 compute-0 sudo[118647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:22 compute-0 python3.9[118649]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:19:22 compute-0 sudo[118647]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:22 compute-0 ceph-mon[75031]: pgmap v222: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:23 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v223: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:23 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:19:23 compute-0 sudo[118800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-calwphlqnsemeakasfrylhmnmkanoege ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580763.3049853-277-114901017485597/AnsiballZ_setup.py'
Dec 01 09:19:23 compute-0 sudo[118800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:23 compute-0 python3.9[118802]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 09:19:24 compute-0 sudo[118800]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:24 compute-0 sudo[118884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phqqehpslqpjmaxbbonjbyttlzpiumxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580763.3049853-277-114901017485597/AnsiballZ_systemd.py'
Dec 01 09:19:24 compute-0 sudo[118884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:24 compute-0 ceph-mon[75031]: pgmap v223: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:25 compute-0 python3.9[118886]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:19:25 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v224: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:25 compute-0 sudo[118884]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:25 compute-0 sshd-session[113493]: Connection closed by 192.168.122.30 port 58494
Dec 01 09:19:25 compute-0 sshd-session[113490]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:19:25 compute-0 systemd-logind[788]: Session 38 logged out. Waiting for processes to exit.
Dec 01 09:19:25 compute-0 systemd[1]: session-38.scope: Deactivated successfully.
Dec 01 09:19:25 compute-0 systemd[1]: session-38.scope: Consumed 25.377s CPU time.
Dec 01 09:19:25 compute-0 systemd-logind[788]: Removed session 38.
Dec 01 09:19:26 compute-0 ceph-mon[75031]: pgmap v224: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:27 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v225: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:28 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:19:28 compute-0 ceph-mon[75031]: pgmap v225: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:29 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v226: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:30 compute-0 ceph-mon[75031]: pgmap v226: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:31 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v227: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:31 compute-0 sshd-session[118913]: Accepted publickey for zuul from 192.168.122.30 port 49474 ssh2: ECDSA SHA256:qeYLzcMt7u+aIXWvxTim9ZQrhR80x0VMZgLc05vjQmw
Dec 01 09:19:31 compute-0 systemd-logind[788]: New session 39 of user zuul.
Dec 01 09:19:31 compute-0 systemd[1]: Started Session 39 of User zuul.
Dec 01 09:19:31 compute-0 sshd-session[118913]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:19:32 compute-0 sudo[119066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwsciufffyhvohrqjetzuvpmblmwtajh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580772.0829935-22-12454349621337/AnsiballZ_file.py'
Dec 01 09:19:32 compute-0 sudo[119066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:32 compute-0 python3.9[119068]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:19:32 compute-0 sudo[119066]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:32 compute-0 ceph-mon[75031]: pgmap v227: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:33 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v228: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:33 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:19:33 compute-0 sudo[119218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxqsscldgnucbcfnhwdxjsinpcplkwlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580773.0227473-34-156199887239949/AnsiballZ_stat.py'
Dec 01 09:19:33 compute-0 sudo[119218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:33 compute-0 python3.9[119220]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:19:33 compute-0 sudo[119218]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:34 compute-0 sudo[119296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emjxcsfvfgtbvqhbncilbglupxxiroll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580773.0227473-34-156199887239949/AnsiballZ_file.py'
Dec 01 09:19:34 compute-0 sudo[119296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:34 compute-0 python3.9[119298]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/ceph-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/ceph-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:19:34 compute-0 sudo[119296]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:34 compute-0 sshd-session[118916]: Connection closed by 192.168.122.30 port 49474
Dec 01 09:19:34 compute-0 sshd-session[118913]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:19:34 compute-0 systemd[1]: session-39.scope: Deactivated successfully.
Dec 01 09:19:34 compute-0 systemd[1]: session-39.scope: Consumed 1.729s CPU time.
Dec 01 09:19:34 compute-0 systemd-logind[788]: Session 39 logged out. Waiting for processes to exit.
Dec 01 09:19:34 compute-0 systemd-logind[788]: Removed session 39.
Dec 01 09:19:34 compute-0 ceph-mon[75031]: pgmap v228: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:35 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v229: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:36 compute-0 ceph-mon[75031]: pgmap v229: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:37 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v230: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:38 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:19:38 compute-0 ceph-mon[75031]: pgmap v230: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:39 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v231: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:39 compute-0 sshd-session[119323]: Accepted publickey for zuul from 192.168.122.30 port 57190 ssh2: ECDSA SHA256:qeYLzcMt7u+aIXWvxTim9ZQrhR80x0VMZgLc05vjQmw
Dec 01 09:19:39 compute-0 systemd-logind[788]: New session 40 of user zuul.
Dec 01 09:19:39 compute-0 systemd[1]: Started Session 40 of User zuul.
Dec 01 09:19:39 compute-0 sshd-session[119323]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:19:40 compute-0 ceph-mon[75031]: pgmap v231: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:41 compute-0 python3.9[119476]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:19:41 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v232: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:42 compute-0 sudo[119630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpfnedhwxweowizaheinploxaaclsinv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580781.6265328-33-182420448468861/AnsiballZ_file.py'
Dec 01 09:19:42 compute-0 sudo[119630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:42 compute-0 python3.9[119632]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:19:42 compute-0 sudo[119630]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:42 compute-0 ceph-mon[75031]: pgmap v232: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:19:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:19:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:19:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:19:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:19:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:19:43 compute-0 sudo[119805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utmphiwiymwaawelflfrfzexwzilqlgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580782.5759494-41-50742574445564/AnsiballZ_stat.py'
Dec 01 09:19:43 compute-0 sudo[119805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:43 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v233: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:43 compute-0 python3.9[119807]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:19:43 compute-0 sudo[119805]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:43 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:19:43 compute-0 sudo[119883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkpfavtojuwpofnbnlefocmzlbhsivmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580782.5759494-41-50742574445564/AnsiballZ_file.py'
Dec 01 09:19:43 compute-0 sudo[119883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:43 compute-0 python3.9[119885]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.7heh9s99 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:19:43 compute-0 sudo[119883]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:44 compute-0 sudo[120035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iousudkimynucwuipefvpruwczzqhtaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580784.2818-61-91834086077899/AnsiballZ_stat.py'
Dec 01 09:19:44 compute-0 sudo[120035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:44 compute-0 python3.9[120037]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:19:44 compute-0 sudo[120035]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:44 compute-0 ceph-mon[75031]: pgmap v233: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:45 compute-0 sudo[120113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdjebcgaefhqrpfbchezvgopavgmkqhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580784.2818-61-91834086077899/AnsiballZ_file.py'
Dec 01 09:19:45 compute-0 sudo[120113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:45 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v234: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:45 compute-0 python3.9[120115]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.mc6gfrtd recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:19:45 compute-0 sudo[120113]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:45 compute-0 sudo[120265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hctxijnejbenjbhlusxpltthguxcxyzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580785.5413814-74-3027073482829/AnsiballZ_file.py'
Dec 01 09:19:45 compute-0 sudo[120265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:46 compute-0 python3.9[120267]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:19:46 compute-0 sudo[120265]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:46 compute-0 sudo[120417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aapbvibqlxbczfxlkbiqymkqqgfhisds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580786.2170815-82-129034188161152/AnsiballZ_stat.py'
Dec 01 09:19:46 compute-0 sudo[120417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:46 compute-0 python3.9[120419]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:19:46 compute-0 sudo[120417]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:46 compute-0 ceph-mon[75031]: pgmap v234: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:47 compute-0 sudo[120495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aixevfkgdxllpbmnphnjivinmmgjeulq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580786.2170815-82-129034188161152/AnsiballZ_file.py'
Dec 01 09:19:47 compute-0 sudo[120495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:47 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v235: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:47 compute-0 python3.9[120497]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:19:47 compute-0 sudo[120495]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:47 compute-0 sudo[120647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vntjnfznufdxkuzwrxucgzbjxebmhwys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580787.4389856-82-66067863671798/AnsiballZ_stat.py'
Dec 01 09:19:47 compute-0 sudo[120647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:47 compute-0 python3.9[120649]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:19:47 compute-0 sudo[120647]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:48 compute-0 sudo[120725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aabkpillvbuamtizefdngkayancrrdvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580787.4389856-82-66067863671798/AnsiballZ_file.py'
Dec 01 09:19:48 compute-0 sudo[120725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:48 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:19:48 compute-0 python3.9[120727]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:19:48 compute-0 sudo[120725]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:49 compute-0 ceph-mon[75031]: pgmap v235: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:49 compute-0 sudo[120877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awuwhcnttsttbmcrdinxgwqhlilyjwcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580788.7684462-105-74870510557680/AnsiballZ_file.py'
Dec 01 09:19:49 compute-0 sudo[120877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:49 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v236: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:49 compute-0 python3.9[120879]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:19:49 compute-0 sudo[120877]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:49 compute-0 sudo[121029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aftpuqkwbiithobwrpzecjyiqknzdtzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580789.540513-113-267799046879923/AnsiballZ_stat.py'
Dec 01 09:19:49 compute-0 sudo[121029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:50 compute-0 python3.9[121031]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:19:50 compute-0 sudo[121029]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:50 compute-0 sudo[121107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yagmnrkfnasbbjcxiuakilaavsttenoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580789.540513-113-267799046879923/AnsiballZ_file.py'
Dec 01 09:19:50 compute-0 sudo[121107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:50 compute-0 python3.9[121109]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:19:50 compute-0 sudo[121107]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:51 compute-0 ceph-mon[75031]: pgmap v236: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:51 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v237: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:51 compute-0 sudo[121259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnwmyfmjaylniggfmsfekcvppyjclhks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580790.815616-125-51872757193723/AnsiballZ_stat.py'
Dec 01 09:19:51 compute-0 sudo[121259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:51 compute-0 python3.9[121261]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:19:51 compute-0 sudo[121259]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:51 compute-0 sudo[121337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nheonoviekqizlkkvevlmxlxbpsmlotx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580790.815616-125-51872757193723/AnsiballZ_file.py'
Dec 01 09:19:51 compute-0 sudo[121337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:51 compute-0 python3.9[121339]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:19:52 compute-0 sudo[121337]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:53 compute-0 ceph-mon[75031]: pgmap v237: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:53 compute-0 sudo[121489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfvcfqbauuavwsktjymnwptagiycaipk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580792.4184384-137-202090787581520/AnsiballZ_systemd.py'
Dec 01 09:19:53 compute-0 sudo[121489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:53 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v238: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:53 compute-0 python3.9[121491]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:19:53 compute-0 systemd[1]: Reloading.
Dec 01 09:19:53 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:19:53 compute-0 systemd-rc-local-generator[121520]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:19:53 compute-0 systemd-sysv-generator[121524]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:19:53 compute-0 sudo[121489]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:54 compute-0 sudo[121679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozcatgokklqyjzuddftbfiofvvssacfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580793.94316-145-137337514024431/AnsiballZ_stat.py'
Dec 01 09:19:54 compute-0 sudo[121679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:54 compute-0 python3.9[121681]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:19:54 compute-0 sudo[121679]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:54 compute-0 sudo[121757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdffblbwugznyywhkwmofysuspuaeico ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580793.94316-145-137337514024431/AnsiballZ_file.py'
Dec 01 09:19:54 compute-0 sudo[121757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:55 compute-0 python3.9[121759]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:19:55 compute-0 sudo[121757]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:55 compute-0 ceph-mon[75031]: pgmap v238: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:55 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v239: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:55 compute-0 sudo[121909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvoailemnyntsyvdlisnlizvfmuwfioj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580795.2295313-157-4237399925198/AnsiballZ_stat.py'
Dec 01 09:19:55 compute-0 sudo[121909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:55 compute-0 python3.9[121911]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:19:55 compute-0 sudo[121909]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:56 compute-0 sudo[121987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phfwjrxralkgbpqwioohkoqmlgxvimef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580795.2295313-157-4237399925198/AnsiballZ_file.py'
Dec 01 09:19:56 compute-0 sudo[121987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:56 compute-0 python3.9[121989]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:19:56 compute-0 sudo[121987]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:56 compute-0 sudo[122139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usynfxbzhncbajvjvkgxzehxzrmjigox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580796.4716163-169-268594050047628/AnsiballZ_systemd.py'
Dec 01 09:19:56 compute-0 sudo[122139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:57 compute-0 python3.9[122141]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:19:57 compute-0 ceph-mon[75031]: pgmap v239: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:57 compute-0 systemd[1]: Reloading.
Dec 01 09:19:57 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v240: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:57 compute-0 systemd-rc-local-generator[122170]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:19:57 compute-0 systemd-sysv-generator[122175]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:19:57 compute-0 systemd[1]: Starting Create netns directory...
Dec 01 09:19:57 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 01 09:19:57 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 01 09:19:57 compute-0 systemd[1]: Finished Create netns directory.
Dec 01 09:19:57 compute-0 sudo[122139]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:58 compute-0 python3.9[122333]: ansible-ansible.builtin.service_facts Invoked
Dec 01 09:19:58 compute-0 network[122350]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 01 09:19:58 compute-0 network[122351]: 'network-scripts' will be removed from distribution in near future.
Dec 01 09:19:58 compute-0 network[122352]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 01 09:19:58 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:19:59 compute-0 ceph-mon[75031]: pgmap v240: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:19:59 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v241: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:01 compute-0 ceph-mon[75031]: pgmap v241: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:01 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v242: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:03 compute-0 sudo[122612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-diwddqwrbxtlpfjzgohwxmrubreqksdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580802.6355436-195-14373446096644/AnsiballZ_stat.py'
Dec 01 09:20:03 compute-0 sudo[122612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:03 compute-0 ceph-mon[75031]: pgmap v242: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:03 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v243: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:03 compute-0 python3.9[122614]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:20:03 compute-0 sudo[122612]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:03 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:20:03 compute-0 sudo[122690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pafwsbfxuhtxbxqhqpccthljzpmkbbvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580802.6355436-195-14373446096644/AnsiballZ_file.py'
Dec 01 09:20:03 compute-0 sudo[122690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:03 compute-0 python3.9[122692]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:20:03 compute-0 sudo[122690]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:04 compute-0 sudo[122842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqgiaawgmmynvrvnskoxolwjyqwyoznv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580804.020002-208-169787887576622/AnsiballZ_file.py'
Dec 01 09:20:04 compute-0 sudo[122842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:04 compute-0 python3.9[122844]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:20:04 compute-0 sudo[122842]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:05 compute-0 sudo[122994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyjxbuxjutxduiadhkddwhlhnviztalh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580804.6840286-216-222315856622690/AnsiballZ_stat.py'
Dec 01 09:20:05 compute-0 sudo[122994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:05 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v244: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:05 compute-0 python3.9[122996]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:20:05 compute-0 sudo[122994]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:05 compute-0 ceph-mon[75031]: pgmap v243: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:05 compute-0 sudo[123072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krxzclfxbdysbqoqailpunytjstkahzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580804.6840286-216-222315856622690/AnsiballZ_file.py'
Dec 01 09:20:05 compute-0 sudo[123072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:05 compute-0 python3.9[123074]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/var/lib/edpm-config/firewall/sshd-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/sshd-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:20:05 compute-0 sudo[123072]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:06 compute-0 ceph-mon[75031]: pgmap v244: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:06 compute-0 sudo[123224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmvegkpcjkogjcvciamxsjdrcixuzbqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580805.9831285-231-138003154671503/AnsiballZ_timezone.py'
Dec 01 09:20:06 compute-0 sudo[123224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:06 compute-0 python3.9[123226]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 01 09:20:06 compute-0 systemd[1]: Starting Time & Date Service...
Dec 01 09:20:06 compute-0 systemd[1]: Started Time & Date Service.
Dec 01 09:20:06 compute-0 sudo[123224]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:07 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v245: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:07 compute-0 sudo[123380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfzacfprdtrznriprouvlnunoqhankmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580807.1795254-240-45693204614687/AnsiballZ_file.py'
Dec 01 09:20:07 compute-0 sudo[123380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:07 compute-0 python3.9[123382]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:20:07 compute-0 sudo[123380]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:08 compute-0 sudo[123532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anldmoslnosypoeudydzprtamixlahyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580807.9894671-248-95182817169132/AnsiballZ_stat.py'
Dec 01 09:20:08 compute-0 sudo[123532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:08 compute-0 ceph-mon[75031]: pgmap v245: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:08 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:20:08 compute-0 python3.9[123534]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:20:08 compute-0 sudo[123532]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:08 compute-0 sudo[123610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vclcvhybidismctmmbugkwalkwtxwway ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580807.9894671-248-95182817169132/AnsiballZ_file.py'
Dec 01 09:20:08 compute-0 sudo[123610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:09 compute-0 python3.9[123612]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:20:09 compute-0 sudo[123610]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:09 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v246: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:09 compute-0 sudo[123762]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xghawupohawxwrwzyaocbeefoehqxafq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580809.3201783-260-256684672913328/AnsiballZ_stat.py'
Dec 01 09:20:09 compute-0 sudo[123762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:09 compute-0 python3.9[123764]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:20:09 compute-0 sudo[123762]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:10 compute-0 sudo[123840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yflnrkvopcopdpdesnxawbqvgynlvbbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580809.3201783-260-256684672913328/AnsiballZ_file.py'
Dec 01 09:20:10 compute-0 sudo[123840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:10 compute-0 python3.9[123842]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.fe4u1wl8 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:20:10 compute-0 ceph-mon[75031]: pgmap v246: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:10 compute-0 sudo[123840]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:10 compute-0 sudo[123948]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:20:10 compute-0 sudo[123948]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:20:10 compute-0 sudo[123948]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:10 compute-0 sudo[124034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fewjehdvllxzuoywqydfjjvropntpqbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580810.6137228-272-56831635904522/AnsiballZ_stat.py'
Dec 01 09:20:10 compute-0 sudo[124034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:10 compute-0 sudo[124001]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:20:10 compute-0 sudo[124001]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:20:10 compute-0 sudo[124001]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:10 compute-0 sudo[124045]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:20:10 compute-0 sudo[124045]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:20:10 compute-0 sudo[124045]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:11 compute-0 sudo[124070]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Dec 01 09:20:11 compute-0 sudo[124070]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:20:11 compute-0 python3.9[124042]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:20:11 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v247: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:11 compute-0 sudo[124034]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:11 compute-0 sudo[124188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyvbjxlajzlaowifefhubhymihqbjacx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580810.6137228-272-56831635904522/AnsiballZ_file.py'
Dec 01 09:20:11 compute-0 sudo[124188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:11 compute-0 sudo[124070]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:11 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:20:11 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:20:11 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec 01 09:20:11 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:20:11 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec 01 09:20:11 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:20:11 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev f360f47f-3d94-438a-9c40-240009d896ed does not exist
Dec 01 09:20:11 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 9254ea8c-1cab-4c87-97c9-c7db68f71411 does not exist
Dec 01 09:20:11 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 590c3f9e-41e3-4055-85ed-67f9c2c7779a does not exist
Dec 01 09:20:11 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec 01 09:20:11 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:20:11 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec 01 09:20:11 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:20:11 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:20:11 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:20:11 compute-0 python3.9[124192]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:20:11 compute-0 sudo[124188]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:11 compute-0 sudo[124205]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:20:11 compute-0 sudo[124205]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:20:11 compute-0 sudo[124205]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:11 compute-0 sudo[124230]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:20:11 compute-0 sudo[124230]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:20:11 compute-0 sudo[124230]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:11 compute-0 sudo[124279]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:20:11 compute-0 sudo[124279]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:20:11 compute-0 sudo[124279]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:11 compute-0 sudo[124304]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --yes --no-systemd
Dec 01 09:20:11 compute-0 sudo[124304]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:20:12 compute-0 podman[124422]: 2025-12-01 09:20:12.319445618 +0000 UTC m=+0.090495113 container create f6e6ae89457b70a295ef12f2edde1216a8c9a0e7832270f50d1d7699afcdd364 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_banach, CEPH_REF=reef, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:20:12 compute-0 podman[124422]: 2025-12-01 09:20:12.255271428 +0000 UTC m=+0.026320943 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:20:12 compute-0 systemd[1]: Started libpod-conmon-f6e6ae89457b70a295ef12f2edde1216a8c9a0e7832270f50d1d7699afcdd364.scope.
Dec 01 09:20:12 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:20:12 compute-0 ceph-mon[75031]: pgmap v247: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:12 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:20:12 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:20:12 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:20:12 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:20:12 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:20:12 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:20:12 compute-0 podman[124422]: 2025-12-01 09:20:12.416123956 +0000 UTC m=+0.187173501 container init f6e6ae89457b70a295ef12f2edde1216a8c9a0e7832270f50d1d7699afcdd364 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_banach, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:20:12 compute-0 podman[124422]: 2025-12-01 09:20:12.425022354 +0000 UTC m=+0.196071869 container start f6e6ae89457b70a295ef12f2edde1216a8c9a0e7832270f50d1d7699afcdd364 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_banach, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2)
Dec 01 09:20:12 compute-0 podman[124422]: 2025-12-01 09:20:12.430032354 +0000 UTC m=+0.201081879 container attach f6e6ae89457b70a295ef12f2edde1216a8c9a0e7832270f50d1d7699afcdd364 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_banach, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:20:12 compute-0 naughty_banach[124439]: 167 167
Dec 01 09:20:12 compute-0 systemd[1]: libpod-f6e6ae89457b70a295ef12f2edde1216a8c9a0e7832270f50d1d7699afcdd364.scope: Deactivated successfully.
Dec 01 09:20:12 compute-0 conmon[124439]: conmon f6e6ae89457b70a295ef <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-f6e6ae89457b70a295ef12f2edde1216a8c9a0e7832270f50d1d7699afcdd364.scope/container/memory.events
Dec 01 09:20:12 compute-0 podman[124422]: 2025-12-01 09:20:12.433606762 +0000 UTC m=+0.204656287 container died f6e6ae89457b70a295ef12f2edde1216a8c9a0e7832270f50d1d7699afcdd364 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_banach, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Dec 01 09:20:12 compute-0 systemd[1]: var-lib-containers-storage-overlay-cce5b5e9d3886810a25a67700d3bd985a1ff3815a76fb42e8ead49f5bd94fded-merged.mount: Deactivated successfully.
Dec 01 09:20:12 compute-0 podman[124422]: 2025-12-01 09:20:12.487790452 +0000 UTC m=+0.258839957 container remove f6e6ae89457b70a295ef12f2edde1216a8c9a0e7832270f50d1d7699afcdd364 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_banach, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, ceph=True)
Dec 01 09:20:12 compute-0 systemd[1]: libpod-conmon-f6e6ae89457b70a295ef12f2edde1216a8c9a0e7832270f50d1d7699afcdd364.scope: Deactivated successfully.
Dec 01 09:20:12 compute-0 sudo[124530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dadtjucsfuylecdxcyrjscznszzowegj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580812.1405675-285-116362694671710/AnsiballZ_command.py'
Dec 01 09:20:12 compute-0 sudo[124530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:12 compute-0 podman[124537]: 2025-12-01 09:20:12.686765167 +0000 UTC m=+0.076916725 container create 4d16158d4e6c1b9d90781c74a3deea84e0602cfd494f27192001921d939fca03 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_bhabha, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef)
Dec 01 09:20:12 compute-0 systemd[1]: Started libpod-conmon-4d16158d4e6c1b9d90781c74a3deea84e0602cfd494f27192001921d939fca03.scope.
Dec 01 09:20:12 compute-0 podman[124537]: 2025-12-01 09:20:12.658549428 +0000 UTC m=+0.048701016 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:20:12 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:20:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68aca2e2454b2770cf6e02cde0af8204f9eef16810bc562776ff0addb8defc75/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:20:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68aca2e2454b2770cf6e02cde0af8204f9eef16810bc562776ff0addb8defc75/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:20:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68aca2e2454b2770cf6e02cde0af8204f9eef16810bc562776ff0addb8defc75/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:20:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68aca2e2454b2770cf6e02cde0af8204f9eef16810bc562776ff0addb8defc75/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:20:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68aca2e2454b2770cf6e02cde0af8204f9eef16810bc562776ff0addb8defc75/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:20:12 compute-0 podman[124537]: 2025-12-01 09:20:12.790116485 +0000 UTC m=+0.180268063 container init 4d16158d4e6c1b9d90781c74a3deea84e0602cfd494f27192001921d939fca03 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_bhabha, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:20:12 compute-0 podman[124537]: 2025-12-01 09:20:12.79890204 +0000 UTC m=+0.189053598 container start 4d16158d4e6c1b9d90781c74a3deea84e0602cfd494f27192001921d939fca03 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_bhabha, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3)
Dec 01 09:20:12 compute-0 podman[124537]: 2025-12-01 09:20:12.802792517 +0000 UTC m=+0.192944095 container attach 4d16158d4e6c1b9d90781c74a3deea84e0602cfd494f27192001921d939fca03 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_bhabha, ceph=True, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:20:12 compute-0 python3.9[124538]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:20:12 compute-0 sudo[124530]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:12 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:20:12
Dec 01 09:20:12 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:20:12 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:20:12 compute-0 ceph-mgr[75324]: [balancer INFO root] pools ['cephfs.cephfs.data', 'vms', 'images', 'volumes', 'cephfs.cephfs.meta', 'backups', '.mgr']
Dec 01 09:20:12 compute-0 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec 01 09:20:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:20:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:20:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:20:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:20:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:20:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:20:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:20:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:20:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:20:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:20:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:20:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:20:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:20:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:20:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:20:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:20:13 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v248: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:20:13 compute-0 sudo[124716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyqcrabloqgggscmqyjcbihxotqugalg ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764580813.051584-293-6321101011104/AnsiballZ_edpm_nftables_from_files.py'
Dec 01 09:20:13 compute-0 sudo[124716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:13 compute-0 python3[124720]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 01 09:20:13 compute-0 sudo[124716]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:14 compute-0 elegant_bhabha[124555]: --> passed data devices: 0 physical, 3 LVM
Dec 01 09:20:14 compute-0 elegant_bhabha[124555]: --> relative data size: 1.0
Dec 01 09:20:14 compute-0 elegant_bhabha[124555]: --> All data devices are unavailable
Dec 01 09:20:14 compute-0 systemd[1]: libpod-4d16158d4e6c1b9d90781c74a3deea84e0602cfd494f27192001921d939fca03.scope: Deactivated successfully.
Dec 01 09:20:14 compute-0 systemd[1]: libpod-4d16158d4e6c1b9d90781c74a3deea84e0602cfd494f27192001921d939fca03.scope: Consumed 1.167s CPU time.
Dec 01 09:20:14 compute-0 podman[124537]: 2025-12-01 09:20:14.04164241 +0000 UTC m=+1.431793978 container died 4d16158d4e6c1b9d90781c74a3deea84e0602cfd494f27192001921d939fca03 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_bhabha, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:20:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-68aca2e2454b2770cf6e02cde0af8204f9eef16810bc562776ff0addb8defc75-merged.mount: Deactivated successfully.
Dec 01 09:20:14 compute-0 podman[124537]: 2025-12-01 09:20:14.106918943 +0000 UTC m=+1.497070491 container remove 4d16158d4e6c1b9d90781c74a3deea84e0602cfd494f27192001921d939fca03 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_bhabha, CEPH_REF=reef, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:20:14 compute-0 systemd[1]: libpod-conmon-4d16158d4e6c1b9d90781c74a3deea84e0602cfd494f27192001921d939fca03.scope: Deactivated successfully.
Dec 01 09:20:14 compute-0 sudo[124304]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:14 compute-0 sudo[124828]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:20:14 compute-0 sudo[124828]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:20:14 compute-0 sudo[124828]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:14 compute-0 sudo[124858]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:20:14 compute-0 sudo[124858]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:20:14 compute-0 sudo[124858]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:14 compute-0 sudo[124902]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:20:14 compute-0 sudo[124902]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:20:14 compute-0 sudo[124902]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:14 compute-0 sudo[124953]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- lvm list --format json
Dec 01 09:20:14 compute-0 sudo[124999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kepowiztbpgvaborsyardjjbwamhdjcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580814.0901666-301-272079729492340/AnsiballZ_stat.py'
Dec 01 09:20:14 compute-0 sudo[124953]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:20:14 compute-0 sudo[124999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:14 compute-0 ceph-mon[75031]: pgmap v248: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:14 compute-0 python3.9[125002]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:20:14 compute-0 sudo[124999]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:14 compute-0 podman[125066]: 2025-12-01 09:20:14.762514243 +0000 UTC m=+0.050172210 container create d21031251c53d767be2ef1a4a8fafca637850ea0fc03edccb1f9a3a6ca7b07da (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_brahmagupta, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:20:14 compute-0 systemd[1]: Started libpod-conmon-d21031251c53d767be2ef1a4a8fafca637850ea0fc03edccb1f9a3a6ca7b07da.scope.
Dec 01 09:20:14 compute-0 podman[125066]: 2025-12-01 09:20:14.737349016 +0000 UTC m=+0.025006993 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:20:14 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:20:14 compute-0 podman[125066]: 2025-12-01 09:20:14.850150949 +0000 UTC m=+0.137808936 container init d21031251c53d767be2ef1a4a8fafca637850ea0fc03edccb1f9a3a6ca7b07da (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_brahmagupta, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Dec 01 09:20:14 compute-0 podman[125066]: 2025-12-01 09:20:14.857046036 +0000 UTC m=+0.144704003 container start d21031251c53d767be2ef1a4a8fafca637850ea0fc03edccb1f9a3a6ca7b07da (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_brahmagupta, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Dec 01 09:20:14 compute-0 podman[125066]: 2025-12-01 09:20:14.861802609 +0000 UTC m=+0.149460576 container attach d21031251c53d767be2ef1a4a8fafca637850ea0fc03edccb1f9a3a6ca7b07da (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_brahmagupta, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:20:14 compute-0 festive_brahmagupta[125108]: 167 167
Dec 01 09:20:14 compute-0 systemd[1]: libpod-d21031251c53d767be2ef1a4a8fafca637850ea0fc03edccb1f9a3a6ca7b07da.scope: Deactivated successfully.
Dec 01 09:20:14 compute-0 podman[125066]: 2025-12-01 09:20:14.865060187 +0000 UTC m=+0.152718154 container died d21031251c53d767be2ef1a4a8fafca637850ea0fc03edccb1f9a3a6ca7b07da (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_brahmagupta, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0)
Dec 01 09:20:14 compute-0 sudo[125138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snnfuxckqgtcrkrpbtaevzzwtodhqksv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580814.0901666-301-272079729492340/AnsiballZ_file.py'
Dec 01 09:20:14 compute-0 sudo[125138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-b8a0cb42acf3c562e529c0261f67c26e99c71779a7411db556eaa16900e87cef-merged.mount: Deactivated successfully.
Dec 01 09:20:15 compute-0 podman[125066]: 2025-12-01 09:20:15.024464082 +0000 UTC m=+0.312122049 container remove d21031251c53d767be2ef1a4a8fafca637850ea0fc03edccb1f9a3a6ca7b07da (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_brahmagupta, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef)
Dec 01 09:20:15 compute-0 systemd[1]: libpod-conmon-d21031251c53d767be2ef1a4a8fafca637850ea0fc03edccb1f9a3a6ca7b07da.scope: Deactivated successfully.
Dec 01 09:20:15 compute-0 python3.9[125142]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:20:15 compute-0 sudo[125138]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:15 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v249: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:15 compute-0 podman[125167]: 2025-12-01 09:20:15.195658311 +0000 UTC m=+0.048917672 container create d5bd032ba033fb4735a95f1c288d3bc663293c072bdc788f2f80488939915ef5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jolly_faraday, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Dec 01 09:20:15 compute-0 systemd[1]: Started libpod-conmon-d5bd032ba033fb4735a95f1c288d3bc663293c072bdc788f2f80488939915ef5.scope.
Dec 01 09:20:15 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:20:15 compute-0 podman[125167]: 2025-12-01 09:20:15.177152505 +0000 UTC m=+0.030411886 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:20:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6d3f2e9583192df983eef63b4e4f1b0eddd028131eea98f0b921ba47f1f0c7fb/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:20:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6d3f2e9583192df983eef63b4e4f1b0eddd028131eea98f0b921ba47f1f0c7fb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:20:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6d3f2e9583192df983eef63b4e4f1b0eddd028131eea98f0b921ba47f1f0c7fb/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:20:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6d3f2e9583192df983eef63b4e4f1b0eddd028131eea98f0b921ba47f1f0c7fb/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:20:15 compute-0 podman[125167]: 2025-12-01 09:20:15.288475032 +0000 UTC m=+0.141734413 container init d5bd032ba033fb4735a95f1c288d3bc663293c072bdc788f2f80488939915ef5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jolly_faraday, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:20:15 compute-0 podman[125167]: 2025-12-01 09:20:15.297121272 +0000 UTC m=+0.150380633 container start d5bd032ba033fb4735a95f1c288d3bc663293c072bdc788f2f80488939915ef5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jolly_faraday, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3)
Dec 01 09:20:15 compute-0 podman[125167]: 2025-12-01 09:20:15.301347359 +0000 UTC m=+0.154606720 container attach d5bd032ba033fb4735a95f1c288d3bc663293c072bdc788f2f80488939915ef5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jolly_faraday, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef)
Dec 01 09:20:15 compute-0 sudo[125332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-faobndqgovlrlpydzxosmfsgwfgasmcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580815.281354-313-172315648257191/AnsiballZ_stat.py'
Dec 01 09:20:15 compute-0 sudo[125332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:15 compute-0 python3.9[125334]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:20:15 compute-0 sudo[125332]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:16 compute-0 jolly_faraday[125207]: {
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:     "0": [
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:         {
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             "devices": [
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "/dev/loop3"
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             ],
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             "lv_name": "ceph_lv0",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             "lv_size": "21470642176",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             "name": "ceph_lv0",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             "tags": {
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.cluster_name": "ceph",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.crush_device_class": "",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.encrypted": "0",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.osd_id": "0",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.type": "block",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.vdo": "0"
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             },
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             "type": "block",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             "vg_name": "ceph_vg0"
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:         }
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:     ],
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:     "1": [
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:         {
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             "devices": [
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "/dev/loop4"
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             ],
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             "lv_name": "ceph_lv1",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             "lv_size": "21470642176",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             "name": "ceph_lv1",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             "tags": {
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.cluster_name": "ceph",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.crush_device_class": "",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.encrypted": "0",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.osd_id": "1",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.type": "block",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.vdo": "0"
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             },
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             "type": "block",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             "vg_name": "ceph_vg1"
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:         }
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:     ],
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:     "2": [
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:         {
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             "devices": [
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "/dev/loop5"
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             ],
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             "lv_name": "ceph_lv2",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             "lv_size": "21470642176",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             "name": "ceph_lv2",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             "tags": {
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.cluster_name": "ceph",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.crush_device_class": "",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.encrypted": "0",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.osd_id": "2",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.type": "block",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:                 "ceph.vdo": "0"
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             },
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             "type": "block",
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:             "vg_name": "ceph_vg2"
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:         }
Dec 01 09:20:16 compute-0 jolly_faraday[125207]:     ]
Dec 01 09:20:16 compute-0 jolly_faraday[125207]: }
Dec 01 09:20:16 compute-0 systemd[1]: libpod-d5bd032ba033fb4735a95f1c288d3bc663293c072bdc788f2f80488939915ef5.scope: Deactivated successfully.
Dec 01 09:20:16 compute-0 podman[125167]: 2025-12-01 09:20:16.096639401 +0000 UTC m=+0.949898822 container died d5bd032ba033fb4735a95f1c288d3bc663293c072bdc788f2f80488939915ef5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jolly_faraday, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:20:16 compute-0 sudo[125414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsyvcjbpzdkurzwzwfxxrvmvbatnxmou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580815.281354-313-172315648257191/AnsiballZ_file.py'
Dec 01 09:20:16 compute-0 sudo[125414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:16 compute-0 systemd[1]: var-lib-containers-storage-overlay-6d3f2e9583192df983eef63b4e4f1b0eddd028131eea98f0b921ba47f1f0c7fb-merged.mount: Deactivated successfully.
Dec 01 09:20:16 compute-0 podman[125167]: 2025-12-01 09:20:16.188638188 +0000 UTC m=+1.041897579 container remove d5bd032ba033fb4735a95f1c288d3bc663293c072bdc788f2f80488939915ef5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jolly_faraday, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, ceph=True, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Dec 01 09:20:16 compute-0 systemd[1]: libpod-conmon-d5bd032ba033fb4735a95f1c288d3bc663293c072bdc788f2f80488939915ef5.scope: Deactivated successfully.
Dec 01 09:20:16 compute-0 sudo[124953]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:16 compute-0 sudo[125431]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:20:16 compute-0 sudo[125431]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:20:16 compute-0 sudo[125431]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:16 compute-0 python3.9[125422]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:20:16 compute-0 sudo[125414]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:16 compute-0 sudo[125456]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:20:16 compute-0 sudo[125456]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:20:16 compute-0 sudo[125456]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:16 compute-0 ceph-mon[75031]: pgmap v249: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:16 compute-0 sudo[125489]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:20:16 compute-0 sudo[125489]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:20:16 compute-0 sudo[125489]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:16 compute-0 sudo[125530]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- raw list --format json
Dec 01 09:20:16 compute-0 sudo[125530]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:20:16 compute-0 sudo[125728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzatykankldpezpdsngtqsljtgxebytg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580816.5601146-325-99114093846070/AnsiballZ_stat.py'
Dec 01 09:20:16 compute-0 sudo[125728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:16 compute-0 podman[125701]: 2025-12-01 09:20:16.927203194 +0000 UTC m=+0.045442878 container create c04d035fdd64901a5b9bc0bdd8277fbf97eadf18d976880a622de861798145e1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_payne, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507)
Dec 01 09:20:16 compute-0 systemd[1]: Started libpod-conmon-c04d035fdd64901a5b9bc0bdd8277fbf97eadf18d976880a622de861798145e1.scope.
Dec 01 09:20:16 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:20:17 compute-0 podman[125701]: 2025-12-01 09:20:16.90815552 +0000 UTC m=+0.026395234 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:20:17 compute-0 podman[125701]: 2025-12-01 09:20:17.011773727 +0000 UTC m=+0.130013491 container init c04d035fdd64901a5b9bc0bdd8277fbf97eadf18d976880a622de861798145e1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_payne, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef)
Dec 01 09:20:17 compute-0 podman[125701]: 2025-12-01 09:20:17.021762998 +0000 UTC m=+0.140002682 container start c04d035fdd64901a5b9bc0bdd8277fbf97eadf18d976880a622de861798145e1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_payne, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:20:17 compute-0 podman[125701]: 2025-12-01 09:20:17.025779489 +0000 UTC m=+0.144019273 container attach c04d035fdd64901a5b9bc0bdd8277fbf97eadf18d976880a622de861798145e1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_payne, CEPH_REF=reef, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507)
Dec 01 09:20:17 compute-0 compassionate_payne[125735]: 167 167
Dec 01 09:20:17 compute-0 systemd[1]: libpod-c04d035fdd64901a5b9bc0bdd8277fbf97eadf18d976880a622de861798145e1.scope: Deactivated successfully.
Dec 01 09:20:17 compute-0 podman[125701]: 2025-12-01 09:20:17.030376817 +0000 UTC m=+0.148616501 container died c04d035fdd64901a5b9bc0bdd8277fbf97eadf18d976880a622de861798145e1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_payne, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef)
Dec 01 09:20:17 compute-0 systemd[1]: var-lib-containers-storage-overlay-3ffaabc829525d83e468e26cf13f00583e02a221b7c5b5fcbb9c258033e16f40-merged.mount: Deactivated successfully.
Dec 01 09:20:17 compute-0 podman[125701]: 2025-12-01 09:20:17.066488893 +0000 UTC m=+0.184728577 container remove c04d035fdd64901a5b9bc0bdd8277fbf97eadf18d976880a622de861798145e1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_payne, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:20:17 compute-0 systemd[1]: libpod-conmon-c04d035fdd64901a5b9bc0bdd8277fbf97eadf18d976880a622de861798145e1.scope: Deactivated successfully.
Dec 01 09:20:17 compute-0 python3.9[125732]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:20:17 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v250: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:17 compute-0 sudo[125728]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:17 compute-0 podman[125761]: 2025-12-01 09:20:17.227663761 +0000 UTC m=+0.046712106 container create dcca6ee7e0333be56c6c075857cb279f6912d50c2bb2dc70f185762ac47312e7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_beaver, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:20:17 compute-0 systemd[1]: Started libpod-conmon-dcca6ee7e0333be56c6c075857cb279f6912d50c2bb2dc70f185762ac47312e7.scope.
Dec 01 09:20:17 compute-0 podman[125761]: 2025-12-01 09:20:17.206560316 +0000 UTC m=+0.025608681 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:20:17 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:20:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d09fbd0dd65100cfc9b462d043041bd1b849ca951a95d9f9822235e7673631aa/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:20:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d09fbd0dd65100cfc9b462d043041bd1b849ca951a95d9f9822235e7673631aa/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:20:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d09fbd0dd65100cfc9b462d043041bd1b849ca951a95d9f9822235e7673631aa/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:20:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d09fbd0dd65100cfc9b462d043041bd1b849ca951a95d9f9822235e7673631aa/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:20:17 compute-0 podman[125761]: 2025-12-01 09:20:17.323175734 +0000 UTC m=+0.142224109 container init dcca6ee7e0333be56c6c075857cb279f6912d50c2bb2dc70f185762ac47312e7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_beaver, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0)
Dec 01 09:20:17 compute-0 podman[125761]: 2025-12-01 09:20:17.338185306 +0000 UTC m=+0.157233681 container start dcca6ee7e0333be56c6c075857cb279f6912d50c2bb2dc70f185762ac47312e7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_beaver, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Dec 01 09:20:17 compute-0 podman[125761]: 2025-12-01 09:20:17.343010381 +0000 UTC m=+0.162058746 container attach dcca6ee7e0333be56c6c075857cb279f6912d50c2bb2dc70f185762ac47312e7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_beaver, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:20:17 compute-0 sudo[125856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhuztiaocmxpgpjnrubgxstuvxeorkss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580816.5601146-325-99114093846070/AnsiballZ_file.py'
Dec 01 09:20:17 compute-0 sudo[125856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:17 compute-0 python3.9[125858]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:20:17 compute-0 sudo[125856]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:18 compute-0 sudo[126016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrkqmuwkcepuvmtfeefwblmkdmkashiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580817.8344486-337-13242299248962/AnsiballZ_stat.py'
Dec 01 09:20:18 compute-0 sudo[126016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:18 compute-0 python3.9[126018]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:20:18 compute-0 admiring_beaver[125801]: {
Dec 01 09:20:18 compute-0 admiring_beaver[125801]:     "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec 01 09:20:18 compute-0 admiring_beaver[125801]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:20:18 compute-0 admiring_beaver[125801]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 01 09:20:18 compute-0 admiring_beaver[125801]:         "osd_id": 0,
Dec 01 09:20:18 compute-0 admiring_beaver[125801]:         "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:20:18 compute-0 admiring_beaver[125801]:         "type": "bluestore"
Dec 01 09:20:18 compute-0 admiring_beaver[125801]:     },
Dec 01 09:20:18 compute-0 admiring_beaver[125801]:     "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec 01 09:20:18 compute-0 admiring_beaver[125801]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:20:18 compute-0 admiring_beaver[125801]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 01 09:20:18 compute-0 admiring_beaver[125801]:         "osd_id": 1,
Dec 01 09:20:18 compute-0 admiring_beaver[125801]:         "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:20:18 compute-0 admiring_beaver[125801]:         "type": "bluestore"
Dec 01 09:20:18 compute-0 admiring_beaver[125801]:     },
Dec 01 09:20:18 compute-0 admiring_beaver[125801]:     "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec 01 09:20:18 compute-0 admiring_beaver[125801]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:20:18 compute-0 admiring_beaver[125801]:         "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec 01 09:20:18 compute-0 admiring_beaver[125801]:         "osd_id": 2,
Dec 01 09:20:18 compute-0 admiring_beaver[125801]:         "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:20:18 compute-0 admiring_beaver[125801]:         "type": "bluestore"
Dec 01 09:20:18 compute-0 admiring_beaver[125801]:     }
Dec 01 09:20:18 compute-0 admiring_beaver[125801]: }
Dec 01 09:20:18 compute-0 sudo[126016]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:18 compute-0 systemd[1]: libpod-dcca6ee7e0333be56c6c075857cb279f6912d50c2bb2dc70f185762ac47312e7.scope: Deactivated successfully.
Dec 01 09:20:18 compute-0 systemd[1]: libpod-dcca6ee7e0333be56c6c075857cb279f6912d50c2bb2dc70f185762ac47312e7.scope: Consumed 1.063s CPU time.
Dec 01 09:20:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:20:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:20:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:20:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:20:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:20:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:20:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:20:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:20:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:20:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:20:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:20:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:20:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec 01 09:20:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:20:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:20:18 compute-0 podman[126041]: 2025-12-01 09:20:18.44342973 +0000 UTC m=+0.031968193 container died dcca6ee7e0333be56c6c075857cb279f6912d50c2bb2dc70f185762ac47312e7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_beaver, org.label-schema.license=GPLv2, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:20:18 compute-0 ceph-mon[75031]: pgmap v250: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:18 compute-0 systemd[1]: var-lib-containers-storage-overlay-d09fbd0dd65100cfc9b462d043041bd1b849ca951a95d9f9822235e7673631aa-merged.mount: Deactivated successfully.
Dec 01 09:20:18 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:20:18 compute-0 podman[126041]: 2025-12-01 09:20:18.518654413 +0000 UTC m=+0.107192796 container remove dcca6ee7e0333be56c6c075857cb279f6912d50c2bb2dc70f185762ac47312e7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_beaver, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Dec 01 09:20:18 compute-0 systemd[1]: libpod-conmon-dcca6ee7e0333be56c6c075857cb279f6912d50c2bb2dc70f185762ac47312e7.scope: Deactivated successfully.
Dec 01 09:20:18 compute-0 sudo[125530]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:18 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:20:18 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:20:18 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:20:18 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:20:18 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev f23033d6-fc11-4a22-bb61-d739105305cd does not exist
Dec 01 09:20:18 compute-0 sudo[126129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzcxkmymkrerllejdjhlkhcdjbthpwns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580817.8344486-337-13242299248962/AnsiballZ_file.py'
Dec 01 09:20:18 compute-0 sudo[126129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:18 compute-0 sudo[126131]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:20:18 compute-0 sudo[126131]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:20:18 compute-0 sudo[126131]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:18 compute-0 sudo[126157]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:20:18 compute-0 sudo[126157]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:20:18 compute-0 sudo[126157]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:18 compute-0 python3.9[126134]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:20:18 compute-0 sudo[126129]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:19 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v251: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:19 compute-0 sudo[126331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohznurpmugoqxgsexpqhjzegntwggewx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580818.998603-349-29446016008623/AnsiballZ_stat.py'
Dec 01 09:20:19 compute-0 sudo[126331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:19 compute-0 python3.9[126333]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:20:19 compute-0 sudo[126331]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:19 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:20:19 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:20:19 compute-0 sudo[126409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpdrimunoetibmaznklnedwcbrhdewqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580818.998603-349-29446016008623/AnsiballZ_file.py'
Dec 01 09:20:19 compute-0 sudo[126409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:20 compute-0 python3.9[126411]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-rules.nft _original_basename=ruleset.j2 recurse=False state=file path=/etc/nftables/edpm-rules.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:20:20 compute-0 sudo[126409]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:20 compute-0 sudo[126561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmpaiempdarkczcxuoaaynsprvwrhkxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580820.2713218-362-209571522632828/AnsiballZ_command.py'
Dec 01 09:20:20 compute-0 sudo[126561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:20 compute-0 python3.9[126563]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:20:20 compute-0 sudo[126561]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:20 compute-0 ceph-mon[75031]: pgmap v251: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:21 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v252: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:21 compute-0 sudo[126716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvramihimurttbjygqqczmrmagndpapw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580821.034498-370-4871766028619/AnsiballZ_blockinfile.py'
Dec 01 09:20:21 compute-0 sudo[126716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:21 compute-0 python3.9[126718]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:20:21 compute-0 sudo[126716]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:22 compute-0 sudo[126868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsrybkkpsplebbvxaahsuapqlwqdtbkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580821.9792767-379-70029183502409/AnsiballZ_file.py'
Dec 01 09:20:22 compute-0 sudo[126868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:22 compute-0 python3.9[126870]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:20:22 compute-0 sudo[126868]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:22 compute-0 sudo[127020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhpcrydndrwmjrscomkbpjzdjdjssxns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580822.64724-379-33112581175556/AnsiballZ_file.py'
Dec 01 09:20:22 compute-0 sudo[127020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:23 compute-0 ceph-mon[75031]: pgmap v252: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:23 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v253: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:23 compute-0 python3.9[127022]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:20:23 compute-0 sudo[127020]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:23 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:20:23 compute-0 sudo[127172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-soeapgylfvdauzscabryyzrdilrtpxrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580823.435464-394-120295377667424/AnsiballZ_mount.py'
Dec 01 09:20:23 compute-0 sudo[127172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:24 compute-0 python3.9[127174]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 01 09:20:24 compute-0 sudo[127172]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:24 compute-0 sudo[127324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scmqzgbyibylrsybqmcgvmcygfkwfyco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580824.3368723-394-74272845934142/AnsiballZ_mount.py'
Dec 01 09:20:24 compute-0 sudo[127324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:24 compute-0 python3.9[127326]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 01 09:20:24 compute-0 sudo[127324]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:25 compute-0 ceph-mon[75031]: pgmap v253: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:25 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v254: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:25 compute-0 sshd-session[119326]: Connection closed by 192.168.122.30 port 57190
Dec 01 09:20:25 compute-0 sshd-session[119323]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:20:25 compute-0 systemd[1]: session-40.scope: Deactivated successfully.
Dec 01 09:20:25 compute-0 systemd[1]: session-40.scope: Consumed 34.192s CPU time.
Dec 01 09:20:25 compute-0 systemd-logind[788]: Session 40 logged out. Waiting for processes to exit.
Dec 01 09:20:25 compute-0 systemd-logind[788]: Removed session 40.
Dec 01 09:20:26 compute-0 ceph-mon[75031]: pgmap v254: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:27 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v255: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:28 compute-0 ceph-mon[75031]: pgmap v255: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:28 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:20:29 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v256: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:30 compute-0 ceph-mon[75031]: pgmap v256: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:31 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v257: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:31 compute-0 sshd-session[127352]: Accepted publickey for zuul from 192.168.122.30 port 54144 ssh2: ECDSA SHA256:qeYLzcMt7u+aIXWvxTim9ZQrhR80x0VMZgLc05vjQmw
Dec 01 09:20:31 compute-0 systemd-logind[788]: New session 41 of user zuul.
Dec 01 09:20:31 compute-0 systemd[1]: Started Session 41 of User zuul.
Dec 01 09:20:31 compute-0 sshd-session[127352]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:20:32 compute-0 ceph-mon[75031]: pgmap v257: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:32 compute-0 sudo[127505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-biqrihjyjkokccfehgzayazebpwqqyuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580831.8597782-16-160496464883683/AnsiballZ_tempfile.py'
Dec 01 09:20:32 compute-0 sudo[127505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:32 compute-0 python3.9[127507]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec 01 09:20:32 compute-0 sudo[127505]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:33 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v258: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:33 compute-0 sudo[127657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjifivdqvbhpbuafiapzvinsrdifmxux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580832.8498726-28-173921002227821/AnsiballZ_stat.py'
Dec 01 09:20:33 compute-0 sudo[127657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:33 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:20:33 compute-0 python3.9[127659]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:20:33 compute-0 sudo[127657]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:34 compute-0 sudo[127811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxgiselordldtkzgaydspqumvbhlsgsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580833.7765489-36-78973206784745/AnsiballZ_slurp.py'
Dec 01 09:20:34 compute-0 sudo[127811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:34 compute-0 ceph-mon[75031]: pgmap v258: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:34 compute-0 python3.9[127813]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Dec 01 09:20:34 compute-0 sudo[127811]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:34 compute-0 sudo[127963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbjiwvjyzdznafnoocdmrgdvfurlcwwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580834.623237-44-178067481465054/AnsiballZ_stat.py'
Dec 01 09:20:34 compute-0 sudo[127963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:35 compute-0 python3.9[127965]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.in05qk5a follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:20:35 compute-0 sudo[127963]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:35 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v259: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:35 compute-0 sudo[128088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itdkillydmijjfghmuqphqxdginzptml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580834.623237-44-178067481465054/AnsiballZ_copy.py'
Dec 01 09:20:35 compute-0 sudo[128088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:35 compute-0 python3.9[128090]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.in05qk5a mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580834.623237-44-178067481465054/.source.in05qk5a _original_basename=.1ewlnk6z follow=False checksum=2242aa230a3299b7aca23dfd6feceb6f43ae540f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:20:35 compute-0 sudo[128088]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:36 compute-0 ceph-mon[75031]: pgmap v259: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:36 compute-0 sudo[128240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxawtrtvhnikziflycwnctxrqtyvtjzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580836.1301486-59-151660965945556/AnsiballZ_setup.py'
Dec 01 09:20:36 compute-0 sudo[128240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:36 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 01 09:20:37 compute-0 python3.9[128242]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:20:37 compute-0 sudo[128240]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:37 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v260: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:37 compute-0 sudo[128394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fohpdgaqtbdndmxrnbzdutzjkxfrkror ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580837.3355575-68-262618453467801/AnsiballZ_blockinfile.py'
Dec 01 09:20:37 compute-0 sudo[128394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:38 compute-0 python3.9[128396]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDRTxmAPcz2eFUCrQOAknLp4ibCvALuiJ7iA+ICPT8Mpd8XYcXDdZBZjlSgWd0U+d6qvFNYaJ4Kq/cNnxeSVMCkpQCGri3TTRfaS9L5COiCf0cmBNheHZSQL0uZLjKzjeaIyGWH6HdOA7KUsCK2YT/Iyf0OJzrBs5vhWuzbSXsCjsHTSzR+XxRX3C/ImHAtccLwxysUhm6H4CGIPn0bY/YGgoRkJUvouHT/4kSxhQrtFAKJOWlJ01d3tdISKrGa+SiKU6zq4yCgT5yeSsMSRyP+L06UuH7Htv2BSPXmTFLy8alJrAKLo19SllAr6m5ZP3OWy9eRDvp+oa4ZA3J9JX+isLwhjDkF1Q+aes+99JQ6E7W5hL8qvDAHCwaKgIo1IRMHJEVvZNsKqn+ME9EBDD1WyTNzik/qEOj2Cr9TXxmps8zD0VcngBAhdAv39R6EAPnVfRf1Goyagp6gPsCOeulh58jgrvAZ7L89u1J5yZY4C2Cu9js9UJwp46pdgU5qDDM=
                                             compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILCzsFh+ZK0hqueDU2gWvb+j6m7hD/RYc8+thzHnJPmj
                                             compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBN8lUi9ZvyyCZ7KdPvA7WBYtjDR8VhQzZuiukEvvpoRp0UJKIzVf11cXzP5sRkLnexUeWiXTv+jZK8hoAN9Othc=
                                              create=True mode=0644 path=/tmp/ansible.in05qk5a state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:20:38 compute-0 sudo[128394]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:38 compute-0 ceph-mon[75031]: pgmap v260: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:38 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:20:38 compute-0 sudo[128546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmzwjlwdolpembeszrqqaupknybejkps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580838.2445743-76-77510693012558/AnsiballZ_command.py'
Dec 01 09:20:38 compute-0 sudo[128546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:38 compute-0 python3.9[128548]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.in05qk5a' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:20:38 compute-0 sudo[128546]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:39 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v261: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:39 compute-0 sudo[128700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffgnygfhbwkjgyraahfcymvnydrbxfkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580839.0973425-84-66423821684536/AnsiballZ_file.py'
Dec 01 09:20:39 compute-0 sudo[128700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:39 compute-0 python3.9[128702]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.in05qk5a state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:20:39 compute-0 sudo[128700]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:40 compute-0 sshd-session[127355]: Connection closed by 192.168.122.30 port 54144
Dec 01 09:20:40 compute-0 sshd-session[127352]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:20:40 compute-0 systemd[1]: session-41.scope: Deactivated successfully.
Dec 01 09:20:40 compute-0 systemd[1]: session-41.scope: Consumed 5.662s CPU time.
Dec 01 09:20:40 compute-0 systemd-logind[788]: Session 41 logged out. Waiting for processes to exit.
Dec 01 09:20:40 compute-0 systemd-logind[788]: Removed session 41.
Dec 01 09:20:40 compute-0 ceph-mon[75031]: pgmap v261: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:41 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v262: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:42 compute-0 ceph-mon[75031]: pgmap v262: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:20:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:20:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:20:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:20:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:20:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:20:43 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v263: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:43 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:20:43 compute-0 sshd-session[71303]: Received disconnect from 38.102.83.177 port 55786:11: disconnected by user
Dec 01 09:20:43 compute-0 sshd-session[71303]: Disconnected from user zuul 38.102.83.177 port 55786
Dec 01 09:20:43 compute-0 sshd-session[71300]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:20:43 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Dec 01 09:20:43 compute-0 systemd[1]: session-18.scope: Consumed 1min 27.611s CPU time.
Dec 01 09:20:43 compute-0 systemd-logind[788]: Session 18 logged out. Waiting for processes to exit.
Dec 01 09:20:43 compute-0 systemd-logind[788]: Removed session 18.
Dec 01 09:20:44 compute-0 ceph-mon[75031]: pgmap v263: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:45 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v264: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:45 compute-0 sshd-session[128728]: Accepted publickey for zuul from 192.168.122.30 port 58598 ssh2: ECDSA SHA256:qeYLzcMt7u+aIXWvxTim9ZQrhR80x0VMZgLc05vjQmw
Dec 01 09:20:45 compute-0 systemd-logind[788]: New session 42 of user zuul.
Dec 01 09:20:45 compute-0 systemd[1]: Started Session 42 of User zuul.
Dec 01 09:20:45 compute-0 sshd-session[128728]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:20:46 compute-0 ceph-mon[75031]: pgmap v264: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:46 compute-0 python3.9[128881]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:20:47 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v265: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:47 compute-0 sudo[129035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-capbxxjwzepzqzvjajdqiiqbmnccsndw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580847.197676-32-165420026144804/AnsiballZ_systemd.py'
Dec 01 09:20:47 compute-0 sudo[129035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:48 compute-0 python3.9[129037]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 01 09:20:48 compute-0 sudo[129035]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:48 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:20:48 compute-0 ceph-mon[75031]: pgmap v265: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:48 compute-0 sudo[129189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mivewufrzuzebbnmnvlxgdyjfqewocon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580848.3462124-40-11929941922389/AnsiballZ_systemd.py'
Dec 01 09:20:48 compute-0 sudo[129189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:48 compute-0 python3.9[129191]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 09:20:49 compute-0 sudo[129189]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:49 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v266: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:49 compute-0 sudo[129342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrnqoauinobmegxoxxhmjvxneeaetyqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580849.2480092-49-87089885339247/AnsiballZ_command.py'
Dec 01 09:20:49 compute-0 sudo[129342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:49 compute-0 python3.9[129344]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:20:50 compute-0 sudo[129342]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:50 compute-0 ceph-mon[75031]: pgmap v266: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:50 compute-0 sudo[129495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmkktcdpvfyqqftpsrviytauowvkmjjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580850.2090688-57-112814106416589/AnsiballZ_stat.py'
Dec 01 09:20:50 compute-0 sudo[129495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:50 compute-0 python3.9[129497]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:20:50 compute-0 sudo[129495]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:51 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v267: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:51 compute-0 sudo[129647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogoywlukpyifvzykhxjmycvkfoaapjpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580851.1503463-66-278212250775243/AnsiballZ_file.py'
Dec 01 09:20:51 compute-0 sudo[129647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:20:51 compute-0 python3.9[129649]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:20:51 compute-0 sudo[129647]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:52 compute-0 sshd-session[128731]: Connection closed by 192.168.122.30 port 58598
Dec 01 09:20:52 compute-0 sshd-session[128728]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:20:52 compute-0 systemd[1]: session-42.scope: Deactivated successfully.
Dec 01 09:20:52 compute-0 systemd[1]: session-42.scope: Consumed 4.273s CPU time.
Dec 01 09:20:52 compute-0 systemd-logind[788]: Session 42 logged out. Waiting for processes to exit.
Dec 01 09:20:52 compute-0 systemd-logind[788]: Removed session 42.
Dec 01 09:20:52 compute-0 ceph-mon[75031]: pgmap v267: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:53 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v268: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:53 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:20:54 compute-0 ceph-mon[75031]: pgmap v268: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:54 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #18. Immutable memtables: 0.
Dec 01 09:20:54 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:20:54.566451) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 09:20:54 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 18
Dec 01 09:20:54 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580854566563, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 6560, "num_deletes": 251, "total_data_size": 7020721, "memory_usage": 7223840, "flush_reason": "Manual Compaction"}
Dec 01 09:20:54 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #19: started
Dec 01 09:20:54 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580854605564, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 19, "file_size": 5332367, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 132, "largest_seqno": 6689, "table_properties": {"data_size": 5309824, "index_size": 14365, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 7109, "raw_key_size": 62773, "raw_average_key_size": 22, "raw_value_size": 5256505, "raw_average_value_size": 1863, "num_data_blocks": 647, "num_entries": 2821, "num_filter_entries": 2821, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580342, "oldest_key_time": 1764580342, "file_creation_time": 1764580854, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 19, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:20:54 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 39178 microseconds, and 17125 cpu microseconds.
Dec 01 09:20:54 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:20:54.605633) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #19: 5332367 bytes OK
Dec 01 09:20:54 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:20:54.605668) [db/memtable_list.cc:519] [default] Level-0 commit table #19 started
Dec 01 09:20:54 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:20:54.607044) [db/memtable_list.cc:722] [default] Level-0 commit table #19: memtable #1 done
Dec 01 09:20:54 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:20:54.607062) EVENT_LOG_v1 {"time_micros": 1764580854607057, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [3, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Dec 01 09:20:54 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:20:54.607091) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[3 0 0 0 0 0 0] max score 0.75
Dec 01 09:20:54 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 6993030, prev total WAL file size 6993030, number of live WAL files 2.
Dec 01 09:20:54 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000014.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:20:54 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:20:54.609258) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730030' seq:72057594037927935, type:22 .. '7061786F7300323532' seq:0, type:0; will stop at (end)
Dec 01 09:20:54 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 3@0 files to L6, score -1.00
Dec 01 09:20:54 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [19(5207KB) 13(50KB) 8(1944B)]
Dec 01 09:20:54 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580854609527, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [19, 13, 8], "score": -1, "input_data_size": 5386105, "oldest_snapshot_seqno": -1}
Dec 01 09:20:54 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #20: 2633 keys, 5343204 bytes, temperature: kUnknown
Dec 01 09:20:54 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580854659891, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 20, "file_size": 5343204, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 5321116, "index_size": 14427, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 6597, "raw_key_size": 60750, "raw_average_key_size": 23, "raw_value_size": 5269488, "raw_average_value_size": 2001, "num_data_blocks": 649, "num_entries": 2633, "num_filter_entries": 2633, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580340, "oldest_key_time": 0, "file_creation_time": 1764580854, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:20:54 compute-0 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 09:20:54 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:20:54.660217) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 3@0 files to L6 => 5343204 bytes
Dec 01 09:20:54 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:20:54.661662) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 106.7 rd, 105.9 wr, level 6, files in(3, 0) out(1 +0 blob) MB in(5.1, 0.0 +0.0 blob) out(5.1 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 2922, records dropped: 289 output_compression: NoCompression
Dec 01 09:20:54 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:20:54.661700) EVENT_LOG_v1 {"time_micros": 1764580854661681, "job": 4, "event": "compaction_finished", "compaction_time_micros": 50469, "compaction_time_cpu_micros": 29139, "output_level": 6, "num_output_files": 1, "total_output_size": 5343204, "num_input_records": 2922, "num_output_records": 2633, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 09:20:54 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000019.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:20:54 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580854664698, "job": 4, "event": "table_file_deletion", "file_number": 19}
Dec 01 09:20:54 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000013.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:20:54 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580854664812, "job": 4, "event": "table_file_deletion", "file_number": 13}
Dec 01 09:20:54 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:20:54 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580854664880, "job": 4, "event": "table_file_deletion", "file_number": 8}
Dec 01 09:20:54 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:20:54.609016) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:20:55 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v269: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:56 compute-0 ceph-mon[75031]: pgmap v269: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:57 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v270: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:58 compute-0 sshd-session[129675]: Accepted publickey for zuul from 192.168.122.30 port 56392 ssh2: ECDSA SHA256:qeYLzcMt7u+aIXWvxTim9ZQrhR80x0VMZgLc05vjQmw
Dec 01 09:20:58 compute-0 systemd-logind[788]: New session 43 of user zuul.
Dec 01 09:20:58 compute-0 systemd[1]: Started Session 43 of User zuul.
Dec 01 09:20:58 compute-0 sshd-session[129675]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:20:58 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:20:58 compute-0 ceph-mon[75031]: pgmap v270: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:59 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v271: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:20:59 compute-0 python3.9[129828]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:21:00 compute-0 sudo[129982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxgtltwmxhcgrvpsbxwqoersmailiuyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580859.8124695-34-128259516304302/AnsiballZ_setup.py'
Dec 01 09:21:00 compute-0 sudo[129982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:00 compute-0 python3.9[129984]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 09:21:00 compute-0 ceph-mon[75031]: pgmap v271: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:00 compute-0 sudo[129982]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:01 compute-0 sudo[130066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlctcvfdtgkkwhwbxsclscygqdyihshu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580859.8124695-34-128259516304302/AnsiballZ_dnf.py'
Dec 01 09:21:01 compute-0 sudo[130066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:01 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v272: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:01 compute-0 python3.9[130068]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 01 09:21:02 compute-0 ceph-mon[75031]: pgmap v272: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:02 compute-0 sudo[130066]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:03 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v273: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:03 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:21:03 compute-0 python3.9[130219]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:21:04 compute-0 ceph-mon[75031]: pgmap v273: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:05 compute-0 python3.9[130370]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 01 09:21:05 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v274: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:05 compute-0 python3.9[130520]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:21:06 compute-0 python3.9[130670]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:21:06 compute-0 ceph-mon[75031]: pgmap v274: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:07 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v275: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:07 compute-0 sshd-session[129678]: Connection closed by 192.168.122.30 port 56392
Dec 01 09:21:07 compute-0 sshd-session[129675]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:21:07 compute-0 systemd[1]: session-43.scope: Deactivated successfully.
Dec 01 09:21:07 compute-0 systemd[1]: session-43.scope: Consumed 6.806s CPU time.
Dec 01 09:21:07 compute-0 systemd-logind[788]: Session 43 logged out. Waiting for processes to exit.
Dec 01 09:21:07 compute-0 systemd-logind[788]: Removed session 43.
Dec 01 09:21:08 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:21:08 compute-0 ceph-mon[75031]: pgmap v275: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:09 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v276: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:10 compute-0 ceph-mon[75031]: pgmap v276: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:11 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v277: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:12 compute-0 sshd-session[130695]: Accepted publickey for zuul from 192.168.122.30 port 43580 ssh2: ECDSA SHA256:qeYLzcMt7u+aIXWvxTim9ZQrhR80x0VMZgLc05vjQmw
Dec 01 09:21:12 compute-0 systemd-logind[788]: New session 44 of user zuul.
Dec 01 09:21:12 compute-0 systemd[1]: Started Session 44 of User zuul.
Dec 01 09:21:12 compute-0 ceph-mon[75031]: pgmap v277: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:12 compute-0 sshd-session[130695]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:21:12 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:21:12
Dec 01 09:21:12 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:21:12 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:21:12 compute-0 ceph-mgr[75324]: [balancer INFO root] pools ['cephfs.cephfs.meta', 'vms', 'backups', 'images', '.mgr', 'cephfs.cephfs.data', 'volumes']
Dec 01 09:21:12 compute-0 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec 01 09:21:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:21:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:21:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:21:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:21:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:21:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:21:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:21:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:21:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:21:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:21:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:21:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:21:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:21:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:21:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:21:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:21:13 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v278: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:21:13 compute-0 python3.9[130848]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:21:14 compute-0 ceph-mon[75031]: pgmap v278: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:15 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v279: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:15 compute-0 sudo[131002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixiivyhbekunbpzhuiatssluvszyilhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580875.0555987-50-70473933975586/AnsiballZ_file.py'
Dec 01 09:21:15 compute-0 sudo[131002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:15 compute-0 python3.9[131004]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:21:15 compute-0 sudo[131002]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:16 compute-0 sudo[131154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnbreoguzyersqyiscgpywuitbghyomb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580875.9657946-50-108825829764011/AnsiballZ_file.py'
Dec 01 09:21:16 compute-0 sudo[131154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:16 compute-0 python3.9[131156]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:21:16 compute-0 sudo[131154]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:16 compute-0 ceph-mon[75031]: pgmap v279: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:17 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v280: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:17 compute-0 sudo[131306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwlkjwoqbcyuxklwdmnogcwzhazxsngv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580876.7077625-65-248268948263347/AnsiballZ_stat.py'
Dec 01 09:21:17 compute-0 sudo[131306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:17 compute-0 python3.9[131308]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:21:17 compute-0 sudo[131306]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:18 compute-0 sudo[131429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emreoqfobmavbtptsvxvuypcqncohgia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580876.7077625-65-248268948263347/AnsiballZ_copy.py'
Dec 01 09:21:18 compute-0 sudo[131429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:18 compute-0 python3.9[131431]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580876.7077625-65-248268948263347/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=d66b669105720ca8abb42d3b5b02733184f83aa9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:21:18 compute-0 sudo[131429]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:21:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:21:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:21:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:21:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:21:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:21:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:21:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:21:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:21:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:21:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:21:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:21:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec 01 09:21:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:21:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:21:18 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:21:18 compute-0 sudo[131555]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:21:18 compute-0 sudo[131555]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:21:18 compute-0 sudo[131555]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:18 compute-0 sudo[131604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxqzwotqkbkoenitkawlngwwfcevardl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580878.5105698-65-42796182260776/AnsiballZ_stat.py'
Dec 01 09:21:18 compute-0 sudo[131604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:18 compute-0 ceph-mon[75031]: pgmap v280: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:18 compute-0 sudo[131607]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:21:18 compute-0 sudo[131607]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:21:18 compute-0 sudo[131607]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:18 compute-0 sudo[131634]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:21:18 compute-0 sudo[131634]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:21:18 compute-0 sudo[131634]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:19 compute-0 sudo[131659]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Dec 01 09:21:19 compute-0 sudo[131659]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:21:19 compute-0 python3.9[131613]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:21:19 compute-0 sudo[131604]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:19 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v281: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:19 compute-0 sudo[131835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmkpcmnltluensoburklueapfcolipmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580878.5105698-65-42796182260776/AnsiballZ_copy.py'
Dec 01 09:21:19 compute-0 sudo[131659]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:19 compute-0 sudo[131835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:19 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:21:19 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:21:19 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec 01 09:21:19 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:21:19 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec 01 09:21:19 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:21:19 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 539a95bf-07d9-4bab-a25b-c5f8ba7b37b9 does not exist
Dec 01 09:21:19 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev b48f8752-d89b-4b48-ac28-ed31ce7d11d5 does not exist
Dec 01 09:21:19 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev fe6b68ed-a32c-418a-a387-c5d1ec2acbce does not exist
Dec 01 09:21:19 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec 01 09:21:19 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:21:19 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec 01 09:21:19 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:21:19 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:21:19 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:21:19 compute-0 sudo[131838]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:21:19 compute-0 sudo[131838]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:21:19 compute-0 sudo[131838]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:19 compute-0 python3.9[131837]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580878.5105698-65-42796182260776/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=08ea373ef3298387cada856fae069f7192f4a06e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:21:19 compute-0 sudo[131863]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:21:19 compute-0 sudo[131863]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:21:19 compute-0 sudo[131863]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:19 compute-0 sudo[131835]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:19 compute-0 sudo[131888]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:21:19 compute-0 sudo[131888]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:21:19 compute-0 sudo[131888]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:19 compute-0 sudo[131937]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --yes --no-systemd
Dec 01 09:21:19 compute-0 sudo[131937]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:21:19 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:21:19 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:21:19 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:21:19 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:21:19 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:21:19 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:21:20 compute-0 sudo[132129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-raupanelilahmkjmmxjeajioedsxupoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580879.8078876-65-126916532795187/AnsiballZ_stat.py'
Dec 01 09:21:20 compute-0 sudo[132129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:20 compute-0 podman[132125]: 2025-12-01 09:21:20.130513924 +0000 UTC m=+0.051502481 container create 20c69efbe6efd3fa9ad2784c370f2a3b3ef9e52dd3024506b64e8b65dd6f3a65 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_davinci, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:21:20 compute-0 systemd[1]: Started libpod-conmon-20c69efbe6efd3fa9ad2784c370f2a3b3ef9e52dd3024506b64e8b65dd6f3a65.scope.
Dec 01 09:21:20 compute-0 podman[132125]: 2025-12-01 09:21:20.110423157 +0000 UTC m=+0.031411744 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:21:20 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:21:20 compute-0 podman[132125]: 2025-12-01 09:21:20.226862474 +0000 UTC m=+0.147851061 container init 20c69efbe6efd3fa9ad2784c370f2a3b3ef9e52dd3024506b64e8b65dd6f3a65 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_davinci, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507)
Dec 01 09:21:20 compute-0 podman[132125]: 2025-12-01 09:21:20.240468765 +0000 UTC m=+0.161457332 container start 20c69efbe6efd3fa9ad2784c370f2a3b3ef9e52dd3024506b64e8b65dd6f3a65 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_davinci, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Dec 01 09:21:20 compute-0 podman[132125]: 2025-12-01 09:21:20.244891982 +0000 UTC m=+0.165880569 container attach 20c69efbe6efd3fa9ad2784c370f2a3b3ef9e52dd3024506b64e8b65dd6f3a65 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_davinci, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Dec 01 09:21:20 compute-0 keen_davinci[132147]: 167 167
Dec 01 09:21:20 compute-0 systemd[1]: libpod-20c69efbe6efd3fa9ad2784c370f2a3b3ef9e52dd3024506b64e8b65dd6f3a65.scope: Deactivated successfully.
Dec 01 09:21:20 compute-0 podman[132125]: 2025-12-01 09:21:20.253901111 +0000 UTC m=+0.174889668 container died 20c69efbe6efd3fa9ad2784c370f2a3b3ef9e52dd3024506b64e8b65dd6f3a65 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_davinci, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2)
Dec 01 09:21:20 compute-0 systemd[1]: var-lib-containers-storage-overlay-7d36b12b6834588e3b147be31070e7e51c80bb87f496efd43eb80000371519a5-merged.mount: Deactivated successfully.
Dec 01 09:21:20 compute-0 python3.9[132139]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:21:20 compute-0 podman[132125]: 2025-12-01 09:21:20.293942622 +0000 UTC m=+0.214931179 container remove 20c69efbe6efd3fa9ad2784c370f2a3b3ef9e52dd3024506b64e8b65dd6f3a65 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_davinci, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:21:20 compute-0 sudo[132129]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:20 compute-0 systemd[1]: libpod-conmon-20c69efbe6efd3fa9ad2784c370f2a3b3ef9e52dd3024506b64e8b65dd6f3a65.scope: Deactivated successfully.
Dec 01 09:21:20 compute-0 podman[132193]: 2025-12-01 09:21:20.464053063 +0000 UTC m=+0.052866491 container create d996c883601752ea81c0328008fd97e2bb7a70a22f53dc0a1a6d4268319f40ed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_proskuriakova, ceph=True, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:21:20 compute-0 systemd[1]: Started libpod-conmon-d996c883601752ea81c0328008fd97e2bb7a70a22f53dc0a1a6d4268319f40ed.scope.
Dec 01 09:21:20 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:21:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73244b3d473a99837e152b8ce8b92c35256bce656eed056fd9144e9f192bf71a/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:21:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73244b3d473a99837e152b8ce8b92c35256bce656eed056fd9144e9f192bf71a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:21:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73244b3d473a99837e152b8ce8b92c35256bce656eed056fd9144e9f192bf71a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:21:20 compute-0 podman[132193]: 2025-12-01 09:21:20.438384675 +0000 UTC m=+0.027198093 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:21:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73244b3d473a99837e152b8ce8b92c35256bce656eed056fd9144e9f192bf71a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:21:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73244b3d473a99837e152b8ce8b92c35256bce656eed056fd9144e9f192bf71a/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:21:20 compute-0 podman[132193]: 2025-12-01 09:21:20.547537792 +0000 UTC m=+0.136351200 container init d996c883601752ea81c0328008fd97e2bb7a70a22f53dc0a1a6d4268319f40ed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_proskuriakova, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=reef, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507)
Dec 01 09:21:20 compute-0 podman[132193]: 2025-12-01 09:21:20.554023649 +0000 UTC m=+0.142837067 container start d996c883601752ea81c0328008fd97e2bb7a70a22f53dc0a1a6d4268319f40ed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_proskuriakova, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:21:20 compute-0 podman[132193]: 2025-12-01 09:21:20.560435913 +0000 UTC m=+0.149249371 container attach d996c883601752ea81c0328008fd97e2bb7a70a22f53dc0a1a6d4268319f40ed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_proskuriakova, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Dec 01 09:21:20 compute-0 sudo[132310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhrvehccjjjlhibqoyfgxemnvscdfbve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580879.8078876-65-126916532795187/AnsiballZ_copy.py'
Dec 01 09:21:20 compute-0 sudo[132310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:20 compute-0 ceph-mon[75031]: pgmap v281: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:21 compute-0 python3.9[132312]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580879.8078876-65-126916532795187/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=e936f8afd9f4f5c6814ddbf37b17bc0751f9c27f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:21:21 compute-0 sudo[132310]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:21 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v282: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:21 compute-0 sudo[132482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecyprhxrwrbebmeqizwuxkzxcrjpnymv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580881.3215756-109-153706148114203/AnsiballZ_file.py'
Dec 01 09:21:21 compute-0 sudo[132482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:21 compute-0 nervous_proskuriakova[132255]: --> passed data devices: 0 physical, 3 LVM
Dec 01 09:21:21 compute-0 nervous_proskuriakova[132255]: --> relative data size: 1.0
Dec 01 09:21:21 compute-0 nervous_proskuriakova[132255]: --> All data devices are unavailable
Dec 01 09:21:21 compute-0 systemd[1]: libpod-d996c883601752ea81c0328008fd97e2bb7a70a22f53dc0a1a6d4268319f40ed.scope: Deactivated successfully.
Dec 01 09:21:21 compute-0 systemd[1]: libpod-d996c883601752ea81c0328008fd97e2bb7a70a22f53dc0a1a6d4268319f40ed.scope: Consumed 1.091s CPU time.
Dec 01 09:21:21 compute-0 podman[132193]: 2025-12-01 09:21:21.716239999 +0000 UTC m=+1.305053437 container died d996c883601752ea81c0328008fd97e2bb7a70a22f53dc0a1a6d4268319f40ed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_proskuriakova, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507)
Dec 01 09:21:21 compute-0 systemd[1]: var-lib-containers-storage-overlay-73244b3d473a99837e152b8ce8b92c35256bce656eed056fd9144e9f192bf71a-merged.mount: Deactivated successfully.
Dec 01 09:21:21 compute-0 podman[132193]: 2025-12-01 09:21:21.787613991 +0000 UTC m=+1.376427399 container remove d996c883601752ea81c0328008fd97e2bb7a70a22f53dc0a1a6d4268319f40ed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_proskuriakova, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Dec 01 09:21:21 compute-0 systemd[1]: libpod-conmon-d996c883601752ea81c0328008fd97e2bb7a70a22f53dc0a1a6d4268319f40ed.scope: Deactivated successfully.
Dec 01 09:21:21 compute-0 sudo[131937]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:21 compute-0 python3.9[132484]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:21:21 compute-0 sudo[132482]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:21 compute-0 sudo[132501]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:21:21 compute-0 sudo[132501]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:21:21 compute-0 sudo[132501]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:21 compute-0 sudo[132530]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:21:21 compute-0 sudo[132530]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:21:21 compute-0 sudo[132530]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:22 compute-0 sudo[132575]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:21:22 compute-0 sudo[132575]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:21:22 compute-0 sudo[132575]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:22 compute-0 sudo[132623]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- lvm list --format json
Dec 01 09:21:22 compute-0 sudo[132623]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:21:22 compute-0 sudo[132768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwrcrblovpmzummuhdpsflimaotoqdmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580882.021555-109-190945349628638/AnsiballZ_file.py'
Dec 01 09:21:22 compute-0 sudo[132768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:22 compute-0 podman[132792]: 2025-12-01 09:21:22.440890591 +0000 UTC m=+0.051424829 container create 24fefc4f40d625a83395ddedd4e8a645e0998380e26ab2ac513de8caa5cd19ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_hopper, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:21:22 compute-0 systemd[1]: Started libpod-conmon-24fefc4f40d625a83395ddedd4e8a645e0998380e26ab2ac513de8caa5cd19ca.scope.
Dec 01 09:21:22 compute-0 python3.9[132777]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:21:22 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:21:22 compute-0 podman[132792]: 2025-12-01 09:21:22.420264698 +0000 UTC m=+0.030798976 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:21:22 compute-0 sudo[132768]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:22 compute-0 podman[132792]: 2025-12-01 09:21:22.528349325 +0000 UTC m=+0.138883593 container init 24fefc4f40d625a83395ddedd4e8a645e0998380e26ab2ac513de8caa5cd19ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_hopper, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec 01 09:21:22 compute-0 podman[132792]: 2025-12-01 09:21:22.535868791 +0000 UTC m=+0.146403039 container start 24fefc4f40d625a83395ddedd4e8a645e0998380e26ab2ac513de8caa5cd19ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_hopper, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507)
Dec 01 09:21:22 compute-0 podman[132792]: 2025-12-01 09:21:22.540046391 +0000 UTC m=+0.150580629 container attach 24fefc4f40d625a83395ddedd4e8a645e0998380e26ab2ac513de8caa5cd19ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_hopper, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:21:22 compute-0 eloquent_hopper[132808]: 167 167
Dec 01 09:21:22 compute-0 systemd[1]: libpod-24fefc4f40d625a83395ddedd4e8a645e0998380e26ab2ac513de8caa5cd19ca.scope: Deactivated successfully.
Dec 01 09:21:22 compute-0 podman[132792]: 2025-12-01 09:21:22.542608785 +0000 UTC m=+0.153143033 container died 24fefc4f40d625a83395ddedd4e8a645e0998380e26ab2ac513de8caa5cd19ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_hopper, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec 01 09:21:22 compute-0 systemd[1]: var-lib-containers-storage-overlay-f98b2a5458889717d101f816b7e8c68926f47ad83408dd4226c34ddea35b87da-merged.mount: Deactivated successfully.
Dec 01 09:21:22 compute-0 podman[132792]: 2025-12-01 09:21:22.631346576 +0000 UTC m=+0.241880844 container remove 24fefc4f40d625a83395ddedd4e8a645e0998380e26ab2ac513de8caa5cd19ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_hopper, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:21:22 compute-0 systemd[1]: libpod-conmon-24fefc4f40d625a83395ddedd4e8a645e0998380e26ab2ac513de8caa5cd19ca.scope: Deactivated successfully.
Dec 01 09:21:22 compute-0 podman[132897]: 2025-12-01 09:21:22.84925793 +0000 UTC m=+0.055879637 container create 1b19831dc58698563a279676b047bbea6e5529cad46bcd6d8bddf7e9daaeb533 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wizardly_hawking, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 01 09:21:22 compute-0 ceph-mon[75031]: pgmap v282: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:22 compute-0 systemd[1]: Started libpod-conmon-1b19831dc58698563a279676b047bbea6e5529cad46bcd6d8bddf7e9daaeb533.scope.
Dec 01 09:21:22 compute-0 podman[132897]: 2025-12-01 09:21:22.826278329 +0000 UTC m=+0.032900046 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:21:22 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:21:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89a673cc04a6dc9b95e247967adbb3267968a74c3c7ed3b38c6558a83dd2bbb6/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:21:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89a673cc04a6dc9b95e247967adbb3267968a74c3c7ed3b38c6558a83dd2bbb6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:21:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89a673cc04a6dc9b95e247967adbb3267968a74c3c7ed3b38c6558a83dd2bbb6/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:21:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89a673cc04a6dc9b95e247967adbb3267968a74c3c7ed3b38c6558a83dd2bbb6/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:21:22 compute-0 podman[132897]: 2025-12-01 09:21:22.949611765 +0000 UTC m=+0.156233532 container init 1b19831dc58698563a279676b047bbea6e5529cad46bcd6d8bddf7e9daaeb533 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wizardly_hawking, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS)
Dec 01 09:21:22 compute-0 podman[132897]: 2025-12-01 09:21:22.961502817 +0000 UTC m=+0.168124474 container start 1b19831dc58698563a279676b047bbea6e5529cad46bcd6d8bddf7e9daaeb533 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wizardly_hawking, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:21:22 compute-0 podman[132897]: 2025-12-01 09:21:22.965150082 +0000 UTC m=+0.171771849 container attach 1b19831dc58698563a279676b047bbea6e5529cad46bcd6d8bddf7e9daaeb533 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wizardly_hawking, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:21:23 compute-0 sudo[133003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qubarstophknmchefbgbpdlpqazyhgwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580882.7331283-124-17564990288686/AnsiballZ_stat.py'
Dec 01 09:21:23 compute-0 sudo[133003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:23 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v283: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:23 compute-0 python3.9[133005]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:21:23 compute-0 sudo[133003]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:23 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:21:23 compute-0 sudo[133126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfbphiwclvizoqnsaaejptzzgrldtcxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580882.7331283-124-17564990288686/AnsiballZ_copy.py'
Dec 01 09:21:23 compute-0 sudo[133126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]: {
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:     "0": [
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:         {
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             "devices": [
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "/dev/loop3"
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             ],
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             "lv_name": "ceph_lv0",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             "lv_size": "21470642176",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             "name": "ceph_lv0",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             "tags": {
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.cluster_name": "ceph",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.crush_device_class": "",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.encrypted": "0",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.osd_id": "0",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.type": "block",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.vdo": "0"
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             },
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             "type": "block",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             "vg_name": "ceph_vg0"
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:         }
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:     ],
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:     "1": [
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:         {
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             "devices": [
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "/dev/loop4"
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             ],
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             "lv_name": "ceph_lv1",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             "lv_size": "21470642176",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             "name": "ceph_lv1",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             "tags": {
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.cluster_name": "ceph",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.crush_device_class": "",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.encrypted": "0",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.osd_id": "1",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.type": "block",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.vdo": "0"
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             },
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             "type": "block",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             "vg_name": "ceph_vg1"
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:         }
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:     ],
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:     "2": [
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:         {
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             "devices": [
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "/dev/loop5"
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             ],
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             "lv_name": "ceph_lv2",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             "lv_size": "21470642176",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             "name": "ceph_lv2",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             "tags": {
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.cluster_name": "ceph",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.crush_device_class": "",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.encrypted": "0",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.osd_id": "2",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.type": "block",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:                 "ceph.vdo": "0"
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             },
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             "type": "block",
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:             "vg_name": "ceph_vg2"
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:         }
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]:     ]
Dec 01 09:21:23 compute-0 wizardly_hawking[132948]: }
Dec 01 09:21:23 compute-0 systemd[1]: libpod-1b19831dc58698563a279676b047bbea6e5529cad46bcd6d8bddf7e9daaeb533.scope: Deactivated successfully.
Dec 01 09:21:23 compute-0 podman[132897]: 2025-12-01 09:21:23.843428269 +0000 UTC m=+1.050049986 container died 1b19831dc58698563a279676b047bbea6e5529cad46bcd6d8bddf7e9daaeb533 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wizardly_hawking, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:21:23 compute-0 systemd[1]: var-lib-containers-storage-overlay-89a673cc04a6dc9b95e247967adbb3267968a74c3c7ed3b38c6558a83dd2bbb6-merged.mount: Deactivated successfully.
Dec 01 09:21:23 compute-0 podman[132897]: 2025-12-01 09:21:23.918073834 +0000 UTC m=+1.124695501 container remove 1b19831dc58698563a279676b047bbea6e5529cad46bcd6d8bddf7e9daaeb533 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wizardly_hawking, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 01 09:21:23 compute-0 python3.9[133130]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580882.7331283-124-17564990288686/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=d62b0903a77dc1bc7a4454e4946f7491a05b6027 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:21:23 compute-0 systemd[1]: libpod-conmon-1b19831dc58698563a279676b047bbea6e5529cad46bcd6d8bddf7e9daaeb533.scope: Deactivated successfully.
Dec 01 09:21:23 compute-0 sudo[132623]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:23 compute-0 sudo[133126]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:24 compute-0 sudo[133145]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:21:24 compute-0 sudo[133145]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:21:24 compute-0 sudo[133145]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:24 compute-0 sudo[133194]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:21:24 compute-0 sudo[133194]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:21:24 compute-0 sudo[133194]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:24 compute-0 sudo[133242]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:21:24 compute-0 sudo[133242]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:21:24 compute-0 sudo[133242]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:24 compute-0 sudo[133296]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- raw list --format json
Dec 01 09:21:24 compute-0 sudo[133296]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:21:24 compute-0 sudo[133396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrahhkysknzddcgmtnexxhqbrsebpyvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580884.133848-124-121422586634669/AnsiballZ_stat.py'
Dec 01 09:21:24 compute-0 sudo[133396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:24 compute-0 python3.9[133408]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:21:24 compute-0 sudo[133396]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:24 compute-0 podman[133437]: 2025-12-01 09:21:24.607774101 +0000 UTC m=+0.053658303 container create ee915ec2f9e69dc9dc0c3b08250884264ca0b0d2a7d22e773d92f0c89e80177a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cranky_roentgen, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec 01 09:21:24 compute-0 systemd[1]: Started libpod-conmon-ee915ec2f9e69dc9dc0c3b08250884264ca0b0d2a7d22e773d92f0c89e80177a.scope.
Dec 01 09:21:24 compute-0 podman[133437]: 2025-12-01 09:21:24.58198953 +0000 UTC m=+0.027873762 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:21:24 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:21:24 compute-0 podman[133437]: 2025-12-01 09:21:24.718625928 +0000 UTC m=+0.164510150 container init ee915ec2f9e69dc9dc0c3b08250884264ca0b0d2a7d22e773d92f0c89e80177a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cranky_roentgen, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True)
Dec 01 09:21:24 compute-0 podman[133437]: 2025-12-01 09:21:24.727784131 +0000 UTC m=+0.173668313 container start ee915ec2f9e69dc9dc0c3b08250884264ca0b0d2a7d22e773d92f0c89e80177a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cranky_roentgen, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, ceph=True)
Dec 01 09:21:24 compute-0 podman[133437]: 2025-12-01 09:21:24.731642172 +0000 UTC m=+0.177526374 container attach ee915ec2f9e69dc9dc0c3b08250884264ca0b0d2a7d22e773d92f0c89e80177a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cranky_roentgen, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:21:24 compute-0 cranky_roentgen[133474]: 167 167
Dec 01 09:21:24 compute-0 systemd[1]: libpod-ee915ec2f9e69dc9dc0c3b08250884264ca0b0d2a7d22e773d92f0c89e80177a.scope: Deactivated successfully.
Dec 01 09:21:24 compute-0 podman[133437]: 2025-12-01 09:21:24.736462291 +0000 UTC m=+0.182346483 container died ee915ec2f9e69dc9dc0c3b08250884264ca0b0d2a7d22e773d92f0c89e80177a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cranky_roentgen, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:21:24 compute-0 systemd[1]: var-lib-containers-storage-overlay-a959c51f468b6cc24e657b8bef059940c507937419c775c1b1abe47e8998657a-merged.mount: Deactivated successfully.
Dec 01 09:21:24 compute-0 podman[133437]: 2025-12-01 09:21:24.775668438 +0000 UTC m=+0.221552630 container remove ee915ec2f9e69dc9dc0c3b08250884264ca0b0d2a7d22e773d92f0c89e80177a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cranky_roentgen, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 01 09:21:24 compute-0 systemd[1]: libpod-conmon-ee915ec2f9e69dc9dc0c3b08250884264ca0b0d2a7d22e773d92f0c89e80177a.scope: Deactivated successfully.
Dec 01 09:21:24 compute-0 ceph-mon[75031]: pgmap v283: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:24 compute-0 podman[133570]: 2025-12-01 09:21:24.965890336 +0000 UTC m=+0.058124312 container create 3900b830a819682448aa00baeedc52a9f2c7fe951ab79d9ab6d57757fe7a1535 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_chatelet, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:21:25 compute-0 sudo[133610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfaielckkylixdwdejuvagcfrvbqhphf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580884.133848-124-121422586634669/AnsiballZ_copy.py'
Dec 01 09:21:25 compute-0 sudo[133610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:25 compute-0 systemd[1]: Started libpod-conmon-3900b830a819682448aa00baeedc52a9f2c7fe951ab79d9ab6d57757fe7a1535.scope.
Dec 01 09:21:25 compute-0 podman[133570]: 2025-12-01 09:21:24.943331678 +0000 UTC m=+0.035565774 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:21:25 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:21:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e3db95974c039bd2c40ab7e8a4ef0c752ae71d80ffad7028216208056fa09ee/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:21:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e3db95974c039bd2c40ab7e8a4ef0c752ae71d80ffad7028216208056fa09ee/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:21:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e3db95974c039bd2c40ab7e8a4ef0c752ae71d80ffad7028216208056fa09ee/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:21:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e3db95974c039bd2c40ab7e8a4ef0c752ae71d80ffad7028216208056fa09ee/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:21:25 compute-0 podman[133570]: 2025-12-01 09:21:25.094361429 +0000 UTC m=+0.186595425 container init 3900b830a819682448aa00baeedc52a9f2c7fe951ab79d9ab6d57757fe7a1535 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_chatelet, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 01 09:21:25 compute-0 podman[133570]: 2025-12-01 09:21:25.102234636 +0000 UTC m=+0.194468612 container start 3900b830a819682448aa00baeedc52a9f2c7fe951ab79d9ab6d57757fe7a1535 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_chatelet, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:21:25 compute-0 podman[133570]: 2025-12-01 09:21:25.105818539 +0000 UTC m=+0.198052515 container attach 3900b830a819682448aa00baeedc52a9f2c7fe951ab79d9ab6d57757fe7a1535 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_chatelet, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:21:25 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v284: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:25 compute-0 python3.9[133614]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580884.133848-124-121422586634669/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=a4bf943053f807dd7cb30711eb355c9616e00332 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:21:25 compute-0 sudo[133610]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:25 compute-0 sudo[133769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yoehiwvdazbbcgxgbbsfrwjzvqatbmix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580885.4008207-124-98226151290056/AnsiballZ_stat.py'
Dec 01 09:21:25 compute-0 sudo[133769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:25 compute-0 python3.9[133771]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:21:25 compute-0 sudo[133769]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:26 compute-0 pedantic_chatelet[133615]: {
Dec 01 09:21:26 compute-0 pedantic_chatelet[133615]:     "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec 01 09:21:26 compute-0 pedantic_chatelet[133615]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:21:26 compute-0 pedantic_chatelet[133615]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 01 09:21:26 compute-0 pedantic_chatelet[133615]:         "osd_id": 0,
Dec 01 09:21:26 compute-0 pedantic_chatelet[133615]:         "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:21:26 compute-0 pedantic_chatelet[133615]:         "type": "bluestore"
Dec 01 09:21:26 compute-0 pedantic_chatelet[133615]:     },
Dec 01 09:21:26 compute-0 pedantic_chatelet[133615]:     "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec 01 09:21:26 compute-0 pedantic_chatelet[133615]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:21:26 compute-0 pedantic_chatelet[133615]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 01 09:21:26 compute-0 pedantic_chatelet[133615]:         "osd_id": 1,
Dec 01 09:21:26 compute-0 pedantic_chatelet[133615]:         "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:21:26 compute-0 pedantic_chatelet[133615]:         "type": "bluestore"
Dec 01 09:21:26 compute-0 pedantic_chatelet[133615]:     },
Dec 01 09:21:26 compute-0 pedantic_chatelet[133615]:     "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec 01 09:21:26 compute-0 pedantic_chatelet[133615]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:21:26 compute-0 pedantic_chatelet[133615]:         "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec 01 09:21:26 compute-0 pedantic_chatelet[133615]:         "osd_id": 2,
Dec 01 09:21:26 compute-0 pedantic_chatelet[133615]:         "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:21:26 compute-0 pedantic_chatelet[133615]:         "type": "bluestore"
Dec 01 09:21:26 compute-0 pedantic_chatelet[133615]:     }
Dec 01 09:21:26 compute-0 pedantic_chatelet[133615]: }
Dec 01 09:21:26 compute-0 systemd[1]: libpod-3900b830a819682448aa00baeedc52a9f2c7fe951ab79d9ab6d57757fe7a1535.scope: Deactivated successfully.
Dec 01 09:21:26 compute-0 systemd[1]: libpod-3900b830a819682448aa00baeedc52a9f2c7fe951ab79d9ab6d57757fe7a1535.scope: Consumed 1.180s CPU time.
Dec 01 09:21:26 compute-0 podman[133570]: 2025-12-01 09:21:26.275161075 +0000 UTC m=+1.367395051 container died 3900b830a819682448aa00baeedc52a9f2c7fe951ab79d9ab6d57757fe7a1535 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_chatelet, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:21:26 compute-0 sudo[133931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndxijwueqegnguzjmppnhbhsmprkkmez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580885.4008207-124-98226151290056/AnsiballZ_copy.py'
Dec 01 09:21:26 compute-0 sudo[133931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:26 compute-0 python3.9[133933]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580885.4008207-124-98226151290056/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=c8d7c531c4af3660f989db6faf3dfcfdf6ae5115 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:21:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-4e3db95974c039bd2c40ab7e8a4ef0c752ae71d80ffad7028216208056fa09ee-merged.mount: Deactivated successfully.
Dec 01 09:21:26 compute-0 sudo[133931]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:26 compute-0 podman[133570]: 2025-12-01 09:21:26.629549092 +0000 UTC m=+1.721783068 container remove 3900b830a819682448aa00baeedc52a9f2c7fe951ab79d9ab6d57757fe7a1535 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_chatelet, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.schema-version=1.0)
Dec 01 09:21:26 compute-0 systemd[1]: libpod-conmon-3900b830a819682448aa00baeedc52a9f2c7fe951ab79d9ab6d57757fe7a1535.scope: Deactivated successfully.
Dec 01 09:21:26 compute-0 sudo[133296]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:26 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:21:26 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:21:26 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:21:26 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:21:26 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev bd1567b8-0051-4df2-93ec-24ff509f4730 does not exist
Dec 01 09:21:26 compute-0 sudo[133959]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:21:26 compute-0 sudo[133959]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:21:26 compute-0 sudo[133959]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:26 compute-0 sudo[133984]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:21:26 compute-0 sudo[133984]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:21:26 compute-0 sudo[133984]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:26 compute-0 ceph-mon[75031]: pgmap v284: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:26 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:21:26 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:21:27 compute-0 sudo[134134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyglzucuacenyznykcacqsbnsswastft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580886.8448088-168-258692340528431/AnsiballZ_file.py'
Dec 01 09:21:27 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v285: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:27 compute-0 sudo[134134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:27 compute-0 python3.9[134136]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:21:27 compute-0 sudo[134134]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:27 compute-0 sudo[134286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kimmflrliwqjtclrxqxjvsnmltrrbuuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580887.5783472-168-5089811669492/AnsiballZ_file.py'
Dec 01 09:21:27 compute-0 sudo[134286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:28 compute-0 python3.9[134288]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:21:28 compute-0 sudo[134286]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:28 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:21:28 compute-0 sudo[134438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxluohfjlujreulfxpsputehimxjfczs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580888.3895655-183-96940074566238/AnsiballZ_stat.py'
Dec 01 09:21:28 compute-0 sudo[134438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:28 compute-0 ceph-mon[75031]: pgmap v285: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:29 compute-0 python3.9[134440]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:21:29 compute-0 sudo[134438]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:29 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v286: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:29 compute-0 sudo[134561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbptndnlroyrypictmmxvegtmpmiptcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580888.3895655-183-96940074566238/AnsiballZ_copy.py'
Dec 01 09:21:29 compute-0 sudo[134561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:29 compute-0 python3.9[134563]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580888.3895655-183-96940074566238/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=00b451d2ac8687242d7356b231cb87a2ffd182d2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:21:29 compute-0 sudo[134561]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:30 compute-0 sudo[134713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bffxmtyncyjzqgltdgddhooegycxgxji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580889.9793203-183-238090447328546/AnsiballZ_stat.py'
Dec 01 09:21:30 compute-0 sudo[134713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:30 compute-0 python3.9[134715]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:21:30 compute-0 sudo[134713]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:30 compute-0 ceph-mon[75031]: pgmap v286: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:30 compute-0 sudo[134836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnkajlnprvzujmjktmltnjdupjrtptph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580889.9793203-183-238090447328546/AnsiballZ_copy.py'
Dec 01 09:21:30 compute-0 sudo[134836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:31 compute-0 python3.9[134838]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580889.9793203-183-238090447328546/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=a4bf943053f807dd7cb30711eb355c9616e00332 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:21:31 compute-0 sudo[134836]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:31 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v287: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:31 compute-0 sudo[134988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxksfcgkbxymrmbhxrbkbehdjxzzgqos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580891.3438542-183-212867816627443/AnsiballZ_stat.py'
Dec 01 09:21:31 compute-0 sudo[134988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:31 compute-0 python3.9[134990]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:21:31 compute-0 sudo[134988]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:32 compute-0 sudo[135111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vheclxduhmqdvppiwbqlsotmcdodmabt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580891.3438542-183-212867816627443/AnsiballZ_copy.py'
Dec 01 09:21:32 compute-0 sudo[135111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:32 compute-0 python3.9[135113]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580891.3438542-183-212867816627443/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=874ab53010d2695f4bed0a375bf3dde853ac0b72 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:21:32 compute-0 sudo[135111]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:32 compute-0 ceph-mon[75031]: pgmap v287: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:33 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v288: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:33 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:21:33 compute-0 sudo[135263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyjzomwavisjngofgfbdbbsufdizeleo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580893.3601465-243-17282091455164/AnsiballZ_file.py'
Dec 01 09:21:33 compute-0 sudo[135263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:33 compute-0 python3.9[135265]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:21:33 compute-0 sudo[135263]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:34 compute-0 sudo[135415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdxdsjfyoxrqjrfcwevstwmgpwcgsvgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580894.1370416-251-227881741510994/AnsiballZ_stat.py'
Dec 01 09:21:34 compute-0 sudo[135415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:34 compute-0 python3.9[135417]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:21:34 compute-0 sudo[135415]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:34 compute-0 ceph-mon[75031]: pgmap v288: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:35 compute-0 sudo[135538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwclelpdpkuheebkrezzhiljqrdrwxii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580894.1370416-251-227881741510994/AnsiballZ_copy.py'
Dec 01 09:21:35 compute-0 sudo[135538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:35 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v289: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:35 compute-0 python3.9[135540]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580894.1370416-251-227881741510994/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=30f43b3b193e6fb640de0bf588bd9062982dce0b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:21:35 compute-0 sudo[135538]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:35 compute-0 sudo[135690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apfncyaddnfnfqngqrprhikzyrohbjrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580895.617539-267-225299631676316/AnsiballZ_file.py'
Dec 01 09:21:35 compute-0 sudo[135690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:36 compute-0 python3.9[135692]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:21:36 compute-0 sudo[135690]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:36 compute-0 sudo[135842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwgtdvdexfndmfwsmjidzfkxwluzshvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580896.3355622-275-225724613973898/AnsiballZ_stat.py'
Dec 01 09:21:36 compute-0 sudo[135842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:36 compute-0 python3.9[135844]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:21:36 compute-0 sudo[135842]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:36 compute-0 ceph-mon[75031]: pgmap v289: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:37 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v290: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:37 compute-0 sudo[135965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znepxkgfkecfhkqbesnuicuzjmqqiaif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580896.3355622-275-225724613973898/AnsiballZ_copy.py'
Dec 01 09:21:37 compute-0 sudo[135965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:37 compute-0 python3.9[135967]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580896.3355622-275-225724613973898/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=30f43b3b193e6fb640de0bf588bd9062982dce0b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:21:37 compute-0 sudo[135965]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:38 compute-0 sudo[136117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnmiyaltcizsfdmxrjtmpglnkajnexnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580898.1433427-291-19538174348768/AnsiballZ_file.py'
Dec 01 09:21:38 compute-0 sudo[136117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:38 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:21:38 compute-0 python3.9[136119]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:21:38 compute-0 sudo[136117]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:39 compute-0 ceph-mon[75031]: pgmap v290: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:39 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v291: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:39 compute-0 sudo[136269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwqdgxqlegjtiiakuzxtkhujlbarwqpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580898.892895-299-176142090972686/AnsiballZ_stat.py'
Dec 01 09:21:39 compute-0 sudo[136269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:39 compute-0 python3.9[136271]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:21:39 compute-0 sudo[136269]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:39 compute-0 sudo[136392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azhdwmcoqrctucqqpfxdnsaflcncxhxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580898.892895-299-176142090972686/AnsiballZ_copy.py'
Dec 01 09:21:39 compute-0 sudo[136392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:40 compute-0 python3.9[136394]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580898.892895-299-176142090972686/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=30f43b3b193e6fb640de0bf588bd9062982dce0b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:21:40 compute-0 sudo[136392]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:40 compute-0 sudo[136544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yplodrbiyaazgifbfnlislxeaodtpffw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580900.3192546-315-63946442870040/AnsiballZ_file.py'
Dec 01 09:21:40 compute-0 sudo[136544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:40 compute-0 python3.9[136546]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:21:40 compute-0 sudo[136544]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:41 compute-0 ceph-mon[75031]: pgmap v291: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:41 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v292: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:41 compute-0 sudo[136696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djnwjtachqegfmtghsetjrvbprqcvomg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580901.044957-323-263294009601000/AnsiballZ_stat.py'
Dec 01 09:21:41 compute-0 sudo[136696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:41 compute-0 python3.9[136698]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:21:41 compute-0 sudo[136696]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:41 compute-0 sudo[136819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tchnlaukoldlwsowtdzkvxxequftfwtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580901.044957-323-263294009601000/AnsiballZ_copy.py'
Dec 01 09:21:41 compute-0 sudo[136819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:42 compute-0 python3.9[136821]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580901.044957-323-263294009601000/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=30f43b3b193e6fb640de0bf588bd9062982dce0b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:21:42 compute-0 sudo[136819]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:42 compute-0 sudo[136971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snwmkdhwuduxoifjuepjtywrsjcbohro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580902.5828228-339-247501658418024/AnsiballZ_file.py'
Dec 01 09:21:42 compute-0 sudo[136971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:43 compute-0 ceph-mon[75031]: pgmap v292: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:21:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:21:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:21:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:21:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:21:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:21:43 compute-0 python3.9[136973]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:21:43 compute-0 sudo[136971]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:43 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v293: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:43 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:21:43 compute-0 sudo[137123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjzczyvaseumvohaomdqattqpwtpbsst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580903.3048897-347-153379687237424/AnsiballZ_stat.py'
Dec 01 09:21:43 compute-0 sudo[137123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:43 compute-0 python3.9[137125]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:21:43 compute-0 sudo[137123]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:44 compute-0 sudo[137246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdvvpzftuzsdbyljtyrhdlxbysekeyrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580903.3048897-347-153379687237424/AnsiballZ_copy.py'
Dec 01 09:21:44 compute-0 sudo[137246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:44 compute-0 python3.9[137248]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580903.3048897-347-153379687237424/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=30f43b3b193e6fb640de0bf588bd9062982dce0b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:21:44 compute-0 sudo[137246]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:45 compute-0 ceph-mon[75031]: pgmap v293: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:45 compute-0 sudo[137398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-caflhjwhrgndysjlahuvsiaoelqghrgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580904.719547-363-180201173789218/AnsiballZ_file.py'
Dec 01 09:21:45 compute-0 sudo[137398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:45 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v294: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:45 compute-0 python3.9[137400]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:21:45 compute-0 sudo[137398]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:45 compute-0 sudo[137550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbwhcyczctunwqhbsfsmqydnsjrgxpkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580905.4577227-371-173330415019643/AnsiballZ_stat.py'
Dec 01 09:21:45 compute-0 sudo[137550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:46 compute-0 python3.9[137552]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:21:46 compute-0 sudo[137550]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:46 compute-0 sudo[137673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zafgedjefaxxpfkrqpsaklizjfyjpile ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580905.4577227-371-173330415019643/AnsiballZ_copy.py'
Dec 01 09:21:46 compute-0 sudo[137673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:46 compute-0 python3.9[137675]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580905.4577227-371-173330415019643/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=30f43b3b193e6fb640de0bf588bd9062982dce0b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:21:46 compute-0 sudo[137673]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:47 compute-0 ceph-mon[75031]: pgmap v294: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:47 compute-0 sshd-session[130698]: Connection closed by 192.168.122.30 port 43580
Dec 01 09:21:47 compute-0 sshd-session[130695]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:21:47 compute-0 systemd[1]: session-44.scope: Deactivated successfully.
Dec 01 09:21:47 compute-0 systemd[1]: session-44.scope: Consumed 27.010s CPU time.
Dec 01 09:21:47 compute-0 systemd-logind[788]: Session 44 logged out. Waiting for processes to exit.
Dec 01 09:21:47 compute-0 systemd-logind[788]: Removed session 44.
Dec 01 09:21:47 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v295: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:48 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:21:49 compute-0 ceph-mon[75031]: pgmap v295: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:49 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v296: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:51 compute-0 ceph-mon[75031]: pgmap v296: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:51 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v297: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:52 compute-0 sshd-session[137700]: Accepted publickey for zuul from 192.168.122.30 port 41486 ssh2: ECDSA SHA256:qeYLzcMt7u+aIXWvxTim9ZQrhR80x0VMZgLc05vjQmw
Dec 01 09:21:52 compute-0 systemd-logind[788]: New session 45 of user zuul.
Dec 01 09:21:52 compute-0 systemd[1]: Started Session 45 of User zuul.
Dec 01 09:21:52 compute-0 sshd-session[137700]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:21:53 compute-0 sudo[137853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-viyhbqxgprhlcjsvhyzhytlaqeuggjsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580912.5379786-22-73268682080879/AnsiballZ_file.py'
Dec 01 09:21:53 compute-0 sudo[137853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:53 compute-0 ceph-mon[75031]: pgmap v297: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:53 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v298: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:53 compute-0 python3.9[137855]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:21:53 compute-0 sudo[137853]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:53 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:21:53 compute-0 sudo[138005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smaiehkhftsuccshfgcphqanjyzrbtnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580913.446532-34-232238526912525/AnsiballZ_stat.py'
Dec 01 09:21:53 compute-0 sudo[138005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:54 compute-0 python3.9[138007]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:21:54 compute-0 sudo[138005]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:54 compute-0 sudo[138128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqrivaunjhjhdiqxtytrqzfjwlpjpytb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580913.446532-34-232238526912525/AnsiballZ_copy.py'
Dec 01 09:21:54 compute-0 sudo[138128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:54 compute-0 python3.9[138130]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580913.446532-34-232238526912525/.source.conf _original_basename=ceph.conf follow=False checksum=2bbdee6ce99be2e18e11631e7462d3c1fd9af211 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:21:54 compute-0 sudo[138128]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:55 compute-0 ceph-mon[75031]: pgmap v298: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:55 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v299: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:55 compute-0 sudo[138280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfxowpxbtjuruxmceuyyjcmwqvbuemvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580915.0227323-34-24352050980876/AnsiballZ_stat.py'
Dec 01 09:21:55 compute-0 sudo[138280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:55 compute-0 python3.9[138282]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:21:55 compute-0 sudo[138280]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:56 compute-0 sudo[138403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thvgxfebulqccfzuzkskiisqjcylfsmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580915.0227323-34-24352050980876/AnsiballZ_copy.py'
Dec 01 09:21:56 compute-0 sudo[138403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:56 compute-0 python3.9[138405]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580915.0227323-34-24352050980876/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=30c595aa84bea916cfc9cc906a8788f27659122a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:21:56 compute-0 sudo[138403]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:56 compute-0 sshd-session[137703]: Connection closed by 192.168.122.30 port 41486
Dec 01 09:21:56 compute-0 sshd-session[137700]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:21:56 compute-0 systemd[1]: session-45.scope: Deactivated successfully.
Dec 01 09:21:56 compute-0 systemd[1]: session-45.scope: Consumed 2.859s CPU time.
Dec 01 09:21:56 compute-0 systemd-logind[788]: Session 45 logged out. Waiting for processes to exit.
Dec 01 09:21:56 compute-0 systemd-logind[788]: Removed session 45.
Dec 01 09:21:57 compute-0 ceph-mon[75031]: pgmap v299: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:57 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v300: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:58 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:21:59 compute-0 ceph-mon[75031]: pgmap v300: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:21:59 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v301: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:01 compute-0 ceph-mon[75031]: pgmap v301: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:01 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v302: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:03 compute-0 ceph-mon[75031]: pgmap v302: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:03 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v303: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:03 compute-0 sshd-session[138430]: Accepted publickey for zuul from 192.168.122.30 port 60188 ssh2: ECDSA SHA256:qeYLzcMt7u+aIXWvxTim9ZQrhR80x0VMZgLc05vjQmw
Dec 01 09:22:03 compute-0 systemd-logind[788]: New session 46 of user zuul.
Dec 01 09:22:03 compute-0 systemd[1]: Started Session 46 of User zuul.
Dec 01 09:22:03 compute-0 sshd-session[138430]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:22:03 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:22:03 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #21. Immutable memtables: 0.
Dec 01 09:22:03 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:22:03.530142) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 09:22:03 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 21
Dec 01 09:22:03 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580923530378, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 768, "num_deletes": 250, "total_data_size": 675094, "memory_usage": 689840, "flush_reason": "Manual Compaction"}
Dec 01 09:22:03 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #22: started
Dec 01 09:22:03 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580923541330, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 22, "file_size": 434410, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6690, "largest_seqno": 7457, "table_properties": {"data_size": 431164, "index_size": 1090, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 8215, "raw_average_key_size": 19, "raw_value_size": 424355, "raw_average_value_size": 1005, "num_data_blocks": 51, "num_entries": 422, "num_filter_entries": 422, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580855, "oldest_key_time": 1764580855, "file_creation_time": 1764580923, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 22, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:22:03 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 11148 microseconds, and 3741 cpu microseconds.
Dec 01 09:22:03 compute-0 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 09:22:03 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:22:03.541390) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #22: 434410 bytes OK
Dec 01 09:22:03 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:22:03.541414) [db/memtable_list.cc:519] [default] Level-0 commit table #22 started
Dec 01 09:22:03 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:22:03.542960) [db/memtable_list.cc:722] [default] Level-0 commit table #22: memtable #1 done
Dec 01 09:22:03 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:22:03.542977) EVENT_LOG_v1 {"time_micros": 1764580923542970, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 01 09:22:03 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:22:03.543000) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 01 09:22:03 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 671220, prev total WAL file size 671220, number of live WAL files 2.
Dec 01 09:22:03 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000018.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:22:03 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:22:03.543572) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740030' seq:72057594037927935, type:22 .. '6D67727374617400323531' seq:0, type:0; will stop at (end)
Dec 01 09:22:03 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 01 09:22:03 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [22(424KB)], [20(5217KB)]
Dec 01 09:22:03 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580923543656, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [22], "files_L6": [20], "score": -1, "input_data_size": 5777614, "oldest_snapshot_seqno": -1}
Dec 01 09:22:03 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #23: 2571 keys, 4204786 bytes, temperature: kUnknown
Dec 01 09:22:03 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580923582006, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 23, "file_size": 4204786, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 4186103, "index_size": 11150, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 6469, "raw_key_size": 59820, "raw_average_key_size": 23, "raw_value_size": 4138447, "raw_average_value_size": 1609, "num_data_blocks": 507, "num_entries": 2571, "num_filter_entries": 2571, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580340, "oldest_key_time": 0, "file_creation_time": 1764580923, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:22:03 compute-0 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 09:22:03 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:22:03.582395) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 4204786 bytes
Dec 01 09:22:03 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:22:03.584282) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 150.2 rd, 109.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 5.1 +0.0 blob) out(4.0 +0.0 blob), read-write-amplify(23.0) write-amplify(9.7) OK, records in: 3055, records dropped: 484 output_compression: NoCompression
Dec 01 09:22:03 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:22:03.584337) EVENT_LOG_v1 {"time_micros": 1764580923584320, "job": 6, "event": "compaction_finished", "compaction_time_micros": 38463, "compaction_time_cpu_micros": 18178, "output_level": 6, "num_output_files": 1, "total_output_size": 4204786, "num_input_records": 3055, "num_output_records": 2571, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 09:22:03 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000022.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:22:03 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580923584589, "job": 6, "event": "table_file_deletion", "file_number": 22}
Dec 01 09:22:03 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:22:03 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764580923585942, "job": 6, "event": "table_file_deletion", "file_number": 20}
Dec 01 09:22:03 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:22:03.543451) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:22:03 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:22:03.586127) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:22:03 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:22:03.586137) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:22:03 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:22:03.586140) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:22:03 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:22:03.586143) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:22:03 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:22:03.586147) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:22:04 compute-0 python3.9[138583]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:22:05 compute-0 ceph-mon[75031]: pgmap v303: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:05 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v304: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:05 compute-0 sudo[138737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpkaoxczbihnzvhrlbljyjdblijuaekf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580925.0527325-34-280044102565088/AnsiballZ_file.py'
Dec 01 09:22:05 compute-0 sudo[138737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:05 compute-0 python3.9[138739]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:22:05 compute-0 sudo[138737]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:06 compute-0 sudo[138889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxrbvbjhfnqgwvlsrjrdzakaetvfkmtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580926.02691-34-109806067097369/AnsiballZ_file.py'
Dec 01 09:22:06 compute-0 sudo[138889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:06 compute-0 python3.9[138891]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:22:06 compute-0 sudo[138889]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:07 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v305: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:07 compute-0 python3.9[139041]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:22:07 compute-0 ceph-mon[75031]: pgmap v304: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:08 compute-0 sudo[139191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgsmuftptqcdpppczgiykgofjzinhsyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580927.509547-57-128204866520808/AnsiballZ_seboolean.py'
Dec 01 09:22:08 compute-0 sudo[139191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:08 compute-0 python3.9[139193]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 01 09:22:08 compute-0 ceph-mon[75031]: pgmap v305: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:08 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:22:09 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v306: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:09 compute-0 sudo[139191]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:10 compute-0 sudo[139347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grofigunvvrudihavcqxtgrhtmwlhdwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580929.70971-67-142450362619182/AnsiballZ_setup.py'
Dec 01 09:22:10 compute-0 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Dec 01 09:22:10 compute-0 sudo[139347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:10 compute-0 python3.9[139349]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 09:22:10 compute-0 ceph-mon[75031]: pgmap v306: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:10 compute-0 sudo[139347]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:11 compute-0 sudo[139431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngfmvenesbwtioibuquqplacpwyjujpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580929.70971-67-142450362619182/AnsiballZ_dnf.py'
Dec 01 09:22:11 compute-0 sudo[139431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:11 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v307: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:11 compute-0 python3.9[139433]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:22:12 compute-0 ceph-mon[75031]: pgmap v307: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:12 compute-0 sudo[139431]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:12 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:22:12
Dec 01 09:22:12 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:22:12 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:22:12 compute-0 ceph-mgr[75324]: [balancer INFO root] pools ['backups', '.mgr', 'images', 'cephfs.cephfs.data', 'vms', 'cephfs.cephfs.meta', 'volumes']
Dec 01 09:22:12 compute-0 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec 01 09:22:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:22:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:22:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:22:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:22:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:22:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:22:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:22:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:22:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:22:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:22:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:22:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:22:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:22:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:22:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:22:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:22:13 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v308: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:22:13 compute-0 sudo[139584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsmqwgusrjvdjesqbjunwbnxsgopaksy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580932.8947237-79-53615983196736/AnsiballZ_systemd.py'
Dec 01 09:22:13 compute-0 sudo[139584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:13 compute-0 python3.9[139586]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 01 09:22:14 compute-0 sudo[139584]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:14 compute-0 ceph-mon[75031]: pgmap v308: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:14 compute-0 sudo[139739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jneywnkbpyoacqzhmiviwnsnfzbozori ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764580934.2504473-87-99358576822514/AnsiballZ_edpm_nftables_snippet.py'
Dec 01 09:22:14 compute-0 sudo[139739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:15 compute-0 python3[139741]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                             rule:
                                               proto: udp
                                               dport: 4789
                                           - rule_name: 119 neutron geneve networks
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               state: ["UNTRACKED"]
                                           - rule_name: 120 neutron geneve networks no conntrack
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               table: raw
                                               chain: OUTPUT
                                               jump: NOTRACK
                                               action: append
                                               state: []
                                           - rule_name: 121 neutron geneve networks no conntrack
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               table: raw
                                               chain: PREROUTING
                                               jump: NOTRACK
                                               action: append
                                               state: []
                                            dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Dec 01 09:22:15 compute-0 sudo[139739]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:15 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v309: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:15 compute-0 sudo[139891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbuhvlojyjbddhebevuzwjijsnsayuhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580935.435741-96-111582113360480/AnsiballZ_file.py'
Dec 01 09:22:15 compute-0 sudo[139891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:16 compute-0 python3.9[139893]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:22:16 compute-0 sudo[139891]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:16 compute-0 ceph-mon[75031]: pgmap v309: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:16 compute-0 sudo[140043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgxshprxuevedlwqnazgujkplokfuzbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580936.2841356-104-269707065315893/AnsiballZ_stat.py'
Dec 01 09:22:16 compute-0 sudo[140043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:16 compute-0 python3.9[140045]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:22:17 compute-0 sudo[140043]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:17 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v310: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:17 compute-0 sudo[140121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pggzlqpvasughnjkyofbjjhqnmncwmfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580936.2841356-104-269707065315893/AnsiballZ_file.py'
Dec 01 09:22:17 compute-0 sudo[140121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:17 compute-0 python3.9[140123]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:22:17 compute-0 sudo[140121]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:17 compute-0 sudo[140273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vliyyiletcvvzvgwrhapvdmechuqzfwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580937.673329-116-185840315954906/AnsiballZ_stat.py'
Dec 01 09:22:17 compute-0 sudo[140273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:18 compute-0 python3.9[140275]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:22:18 compute-0 sudo[140273]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:22:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:22:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:22:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:22:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:22:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:22:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:22:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:22:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:22:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:22:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:22:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:22:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec 01 09:22:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:22:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:22:18 compute-0 sudo[140351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brwqovsgxprptihpnulkxxnkdfyttedh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580937.673329-116-185840315954906/AnsiballZ_file.py'
Dec 01 09:22:18 compute-0 sudo[140351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:18 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:22:18 compute-0 ceph-mon[75031]: pgmap v310: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:18 compute-0 python3.9[140353]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.r7zapobs recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:22:18 compute-0 sudo[140351]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:19 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v311: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:19 compute-0 sudo[140503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-divyufibvjobymgvewlwofrkguoolksx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580938.9089193-128-263034161340842/AnsiballZ_stat.py'
Dec 01 09:22:19 compute-0 sudo[140503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:19 compute-0 python3.9[140505]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:22:19 compute-0 sudo[140503]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:19 compute-0 sudo[140581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umpcjcqbufubfknfirfobeybztplvrqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580938.9089193-128-263034161340842/AnsiballZ_file.py'
Dec 01 09:22:19 compute-0 sudo[140581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:20 compute-0 python3.9[140583]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:22:20 compute-0 sudo[140581]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:20 compute-0 ceph-mon[75031]: pgmap v311: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:20 compute-0 sudo[140733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qabsejjsddtnmmxxqrykwezlwbtmgsng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580940.3184793-141-86745022019671/AnsiballZ_command.py'
Dec 01 09:22:20 compute-0 sudo[140733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:21 compute-0 python3.9[140735]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:22:21 compute-0 sudo[140733]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:21 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v312: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:21 compute-0 sudo[140886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwhbdzcyelivpcblnabvpzhjrftidudk ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764580941.332958-149-172626319640179/AnsiballZ_edpm_nftables_from_files.py'
Dec 01 09:22:21 compute-0 sudo[140886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:22 compute-0 python3[140888]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 01 09:22:22 compute-0 sudo[140886]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:22 compute-0 sudo[141038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzjechderaxpjyfzgyiowqirpjawstlt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580942.225214-157-231396235722568/AnsiballZ_stat.py'
Dec 01 09:22:22 compute-0 sudo[141038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:22 compute-0 ceph-mon[75031]: pgmap v312: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:22 compute-0 python3.9[141040]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:22:22 compute-0 sudo[141038]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:22 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:22:22 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Cumulative writes: 1726 writes, 7468 keys, 1726 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.01 MB/s
                                           Cumulative WAL: 1726 writes, 1726 syncs, 1.00 writes per sync, written: 0.01 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1726 writes, 7468 keys, 1726 commit groups, 1.0 writes per commit group, ingest: 7.48 MB, 0.01 MB/s
                                           Interval WAL: 1726 writes, 1726 syncs, 1.00 writes per sync, written: 0.01 GB, 0.01 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    105.0      0.05              0.02         3    0.018       0      0       0.0       0.0
                                             L6      1/0    4.01 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.6    119.7    102.4      0.09              0.05         2    0.044    5977    773       0.0       0.0
                                            Sum      1/0    4.01 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   2.6     75.1    103.4      0.14              0.07         5    0.028    5977    773       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   2.7     76.5    104.9      0.14              0.07         4    0.035    5977    773       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0    119.7    102.4      0.09              0.05         2    0.044    5977    773       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    109.3      0.05              0.02         2    0.025       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     19.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.005, interval 0.005
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.01 GB write, 0.02 MB/s write, 0.01 GB read, 0.02 MB/s read, 0.1 seconds
                                           Interval compaction: 0.01 GB write, 0.02 MB/s write, 0.01 GB read, 0.02 MB/s read, 0.1 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55bbd56b51f0#2 capacity: 308.00 MB usage: 560.92 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 5.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(36,492.47 KB,0.156145%) FilterBlock(6,22.67 KB,0.00718847%) IndexBlock(6,45.78 KB,0.0145157%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Dec 01 09:22:23 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v313: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:23 compute-0 sudo[141163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edcjtyqgicpmwfjisgnwpqeycofhkmkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580942.225214-157-231396235722568/AnsiballZ_copy.py'
Dec 01 09:22:23 compute-0 sudo[141163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:23 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:22:23 compute-0 python3.9[141165]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580942.225214-157-231396235722568/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:22:23 compute-0 sudo[141163]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:24 compute-0 sudo[141315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxwyfyjnwqcqfmqocedebtfbvccuphkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580943.7514427-172-165017964126906/AnsiballZ_stat.py'
Dec 01 09:22:24 compute-0 sudo[141315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:24 compute-0 python3.9[141317]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:22:24 compute-0 sudo[141315]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:24 compute-0 ceph-mon[75031]: pgmap v313: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:24 compute-0 sudo[141440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyaezlnnpdfdunxdfrstvecegqxydhcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580943.7514427-172-165017964126906/AnsiballZ_copy.py'
Dec 01 09:22:24 compute-0 sudo[141440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:24 compute-0 python3.9[141442]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580943.7514427-172-165017964126906/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:22:24 compute-0 sudo[141440]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:25 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v314: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:25 compute-0 sudo[141592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awulxqoiebmfqrkducprhjbkxfldjacx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580945.1733842-187-192928384059708/AnsiballZ_stat.py'
Dec 01 09:22:25 compute-0 sudo[141592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:25 compute-0 python3.9[141594]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:22:25 compute-0 sudo[141592]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:26 compute-0 sudo[141717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frhpbsbgtmmmwxzepajoaegzcrzttxji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580945.1733842-187-192928384059708/AnsiballZ_copy.py'
Dec 01 09:22:26 compute-0 sudo[141717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:26 compute-0 python3.9[141719]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580945.1733842-187-192928384059708/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:22:26 compute-0 sudo[141717]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:26 compute-0 ceph-mon[75031]: pgmap v314: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:26 compute-0 sudo[141869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-noldpittikrmblcidddkabpxkyknysve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580946.4221358-202-42183785616875/AnsiballZ_stat.py'
Dec 01 09:22:26 compute-0 sudo[141869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:26 compute-0 sudo[141872]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:22:26 compute-0 sudo[141872]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:22:26 compute-0 sudo[141872]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:26 compute-0 sudo[141897]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:22:26 compute-0 sudo[141897]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:22:26 compute-0 sudo[141897]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:27 compute-0 python3.9[141871]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:22:27 compute-0 sudo[141922]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:22:27 compute-0 sudo[141922]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:22:27 compute-0 sudo[141922]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:27 compute-0 sudo[141869]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:27 compute-0 sudo[141949]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Dec 01 09:22:27 compute-0 sudo[141949]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:22:27 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v315: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:27 compute-0 sudo[142111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwxhxmxjamqjjubsohlehbcgziqgoqqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580946.4221358-202-42183785616875/AnsiballZ_copy.py'
Dec 01 09:22:27 compute-0 sudo[142111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:27 compute-0 sudo[141949]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:27 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:22:27 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:22:27 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec 01 09:22:27 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:22:27 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec 01 09:22:27 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:22:27 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 8821ad64-a99e-42f4-82c4-2c1991d750fe does not exist
Dec 01 09:22:27 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 4331d7c3-f957-4659-b3c6-1932c1608ea8 does not exist
Dec 01 09:22:27 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 2fbe7e33-cf3d-46b4-9c8e-efa2c206b754 does not exist
Dec 01 09:22:27 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec 01 09:22:27 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:22:27 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec 01 09:22:27 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:22:27 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:22:27 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:22:27 compute-0 sudo[142128]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:22:27 compute-0 sudo[142128]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:22:27 compute-0 sudo[142128]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:27 compute-0 python3.9[142113]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580946.4221358-202-42183785616875/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:22:27 compute-0 sudo[142111]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:27 compute-0 sudo[142153]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:22:27 compute-0 sudo[142153]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:22:27 compute-0 sudo[142153]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:27 compute-0 sudo[142195]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:22:27 compute-0 sudo[142195]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:22:27 compute-0 sudo[142195]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:27 compute-0 sudo[142228]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --yes --no-systemd
Dec 01 09:22:27 compute-0 sudo[142228]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:22:28 compute-0 podman[142381]: 2025-12-01 09:22:28.31688917 +0000 UTC m=+0.049867447 container create c8aed5416b546c8bc7287bae560158ebe64811bcbfcf1e9735818e6c5c0f7c31 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_lovelace, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:22:28 compute-0 systemd[1]: Started libpod-conmon-c8aed5416b546c8bc7287bae560158ebe64811bcbfcf1e9735818e6c5c0f7c31.scope.
Dec 01 09:22:28 compute-0 sudo[142428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifohhbhjlaqtbngomviuyewspwtcyruc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580947.9659944-217-195359211251029/AnsiballZ_stat.py'
Dec 01 09:22:28 compute-0 sudo[142428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:28 compute-0 podman[142381]: 2025-12-01 09:22:28.29611235 +0000 UTC m=+0.029090657 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:22:28 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:22:28 compute-0 podman[142381]: 2025-12-01 09:22:28.423041965 +0000 UTC m=+0.156020252 container init c8aed5416b546c8bc7287bae560158ebe64811bcbfcf1e9735818e6c5c0f7c31 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_lovelace, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True)
Dec 01 09:22:28 compute-0 podman[142381]: 2025-12-01 09:22:28.433586555 +0000 UTC m=+0.166564852 container start c8aed5416b546c8bc7287bae560158ebe64811bcbfcf1e9735818e6c5c0f7c31 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_lovelace, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:22:28 compute-0 podman[142381]: 2025-12-01 09:22:28.437530867 +0000 UTC m=+0.170509164 container attach c8aed5416b546c8bc7287bae560158ebe64811bcbfcf1e9735818e6c5c0f7c31 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_lovelace, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec 01 09:22:28 compute-0 laughing_lovelace[142432]: 167 167
Dec 01 09:22:28 compute-0 systemd[1]: libpod-c8aed5416b546c8bc7287bae560158ebe64811bcbfcf1e9735818e6c5c0f7c31.scope: Deactivated successfully.
Dec 01 09:22:28 compute-0 podman[142381]: 2025-12-01 09:22:28.444554906 +0000 UTC m=+0.177533183 container died c8aed5416b546c8bc7287bae560158ebe64811bcbfcf1e9735818e6c5c0f7c31 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_lovelace, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 01 09:22:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-9a3a712a3db0981c066aef77fef4972d490272ac4d65e20de91f76144bdcd027-merged.mount: Deactivated successfully.
Dec 01 09:22:28 compute-0 podman[142381]: 2025-12-01 09:22:28.499173478 +0000 UTC m=+0.232151745 container remove c8aed5416b546c8bc7287bae560158ebe64811bcbfcf1e9735818e6c5c0f7c31 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_lovelace, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef)
Dec 01 09:22:28 compute-0 systemd[1]: libpod-conmon-c8aed5416b546c8bc7287bae560158ebe64811bcbfcf1e9735818e6c5c0f7c31.scope: Deactivated successfully.
Dec 01 09:22:28 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:22:28 compute-0 python3.9[142434]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:22:28 compute-0 sudo[142428]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:28 compute-0 ceph-mon[75031]: pgmap v315: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:28 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:22:28 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:22:28 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:22:28 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:22:28 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:22:28 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:22:28 compute-0 podman[142459]: 2025-12-01 09:22:28.673455617 +0000 UTC m=+0.044112873 container create 6bb454caad6554abf4d6cd46be8b988a34d0af910b81bf27cbd990f8dd6e8a19 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_kilby, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:22:28 compute-0 systemd[1]: Started libpod-conmon-6bb454caad6554abf4d6cd46be8b988a34d0af910b81bf27cbd990f8dd6e8a19.scope.
Dec 01 09:22:28 compute-0 podman[142459]: 2025-12-01 09:22:28.651464233 +0000 UTC m=+0.022121489 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:22:28 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:22:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02e9a3fda8131a5b1317ed5055fb208c60e7b084e6d674b873db6747dd81020f/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:22:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02e9a3fda8131a5b1317ed5055fb208c60e7b084e6d674b873db6747dd81020f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:22:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02e9a3fda8131a5b1317ed5055fb208c60e7b084e6d674b873db6747dd81020f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:22:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02e9a3fda8131a5b1317ed5055fb208c60e7b084e6d674b873db6747dd81020f/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:22:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02e9a3fda8131a5b1317ed5055fb208c60e7b084e6d674b873db6747dd81020f/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:22:28 compute-0 podman[142459]: 2025-12-01 09:22:28.785188491 +0000 UTC m=+0.155845767 container init 6bb454caad6554abf4d6cd46be8b988a34d0af910b81bf27cbd990f8dd6e8a19 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_kilby, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.schema-version=1.0)
Dec 01 09:22:28 compute-0 podman[142459]: 2025-12-01 09:22:28.798701995 +0000 UTC m=+0.169359231 container start 6bb454caad6554abf4d6cd46be8b988a34d0af910b81bf27cbd990f8dd6e8a19 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_kilby, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 01 09:22:28 compute-0 podman[142459]: 2025-12-01 09:22:28.802154723 +0000 UTC m=+0.172811979 container attach 6bb454caad6554abf4d6cd46be8b988a34d0af910b81bf27cbd990f8dd6e8a19 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_kilby, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0)
Dec 01 09:22:28 compute-0 sudo[142601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rklzpltbzwlguzgxuhhilfgyrcjhwdkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580947.9659944-217-195359211251029/AnsiballZ_copy.py'
Dec 01 09:22:28 compute-0 sudo[142601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:29 compute-0 python3.9[142603]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764580947.9659944-217-195359211251029/.source.nft follow=False _original_basename=ruleset.j2 checksum=bdba38546f86123f1927359d89789bd211aba99d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:22:29 compute-0 sudo[142601]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:29 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v316: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:29 compute-0 sudo[142766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyofslrjdlempqkmeymwbhtnrqresogr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580949.3693528-232-104234090836752/AnsiballZ_file.py'
Dec 01 09:22:29 compute-0 sudo[142766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:29 compute-0 affectionate_kilby[142506]: --> passed data devices: 0 physical, 3 LVM
Dec 01 09:22:29 compute-0 affectionate_kilby[142506]: --> relative data size: 1.0
Dec 01 09:22:29 compute-0 affectionate_kilby[142506]: --> All data devices are unavailable
Dec 01 09:22:29 compute-0 python3.9[142769]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:22:29 compute-0 sudo[142766]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:29 compute-0 systemd[1]: libpod-6bb454caad6554abf4d6cd46be8b988a34d0af910b81bf27cbd990f8dd6e8a19.scope: Deactivated successfully.
Dec 01 09:22:29 compute-0 systemd[1]: libpod-6bb454caad6554abf4d6cd46be8b988a34d0af910b81bf27cbd990f8dd6e8a19.scope: Consumed 1.034s CPU time.
Dec 01 09:22:29 compute-0 podman[142459]: 2025-12-01 09:22:29.893317553 +0000 UTC m=+1.263974779 container died 6bb454caad6554abf4d6cd46be8b988a34d0af910b81bf27cbd990f8dd6e8a19 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_kilby, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 01 09:22:29 compute-0 systemd[1]: var-lib-containers-storage-overlay-02e9a3fda8131a5b1317ed5055fb208c60e7b084e6d674b873db6747dd81020f-merged.mount: Deactivated successfully.
Dec 01 09:22:29 compute-0 podman[142459]: 2025-12-01 09:22:29.952217026 +0000 UTC m=+1.322874252 container remove 6bb454caad6554abf4d6cd46be8b988a34d0af910b81bf27cbd990f8dd6e8a19 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_kilby, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Dec 01 09:22:29 compute-0 systemd[1]: libpod-conmon-6bb454caad6554abf4d6cd46be8b988a34d0af910b81bf27cbd990f8dd6e8a19.scope: Deactivated successfully.
Dec 01 09:22:29 compute-0 sudo[142228]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:30 compute-0 sudo[142817]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:22:30 compute-0 sudo[142817]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:22:30 compute-0 sudo[142817]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:30 compute-0 sudo[142842]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:22:30 compute-0 sudo[142842]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:22:30 compute-0 sudo[142842]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:30 compute-0 sudo[142867]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:22:30 compute-0 sudo[142867]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:22:30 compute-0 sudo[142867]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:30 compute-0 sudo[142895]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- lvm list --format json
Dec 01 09:22:30 compute-0 sudo[142895]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:22:30 compute-0 sudo[143073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efcbatkluukpwijwhattupsaobhdbzfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580950.2429993-240-260758162328959/AnsiballZ_command.py'
Dec 01 09:22:30 compute-0 sudo[143073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:30 compute-0 ceph-mon[75031]: pgmap v316: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:30 compute-0 podman[143083]: 2025-12-01 09:22:30.679323427 +0000 UTC m=+0.064914485 container create 661ad48fe5e4bbe5a7d276beb772954aa48e3cf1ef648382ee6d279a48e3f509 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_moore, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:22:30 compute-0 systemd[1]: Started libpod-conmon-661ad48fe5e4bbe5a7d276beb772954aa48e3cf1ef648382ee6d279a48e3f509.scope.
Dec 01 09:22:30 compute-0 python3.9[143080]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:22:30 compute-0 podman[143083]: 2025-12-01 09:22:30.658274619 +0000 UTC m=+0.043865737 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:22:30 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:22:30 compute-0 podman[143083]: 2025-12-01 09:22:30.769438666 +0000 UTC m=+0.155029754 container init 661ad48fe5e4bbe5a7d276beb772954aa48e3cf1ef648382ee6d279a48e3f509 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_moore, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:22:30 compute-0 podman[143083]: 2025-12-01 09:22:30.779729068 +0000 UTC m=+0.165320136 container start 661ad48fe5e4bbe5a7d276beb772954aa48e3cf1ef648382ee6d279a48e3f509 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_moore, io.buildah.version=1.39.3, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:22:30 compute-0 podman[143083]: 2025-12-01 09:22:30.783752733 +0000 UTC m=+0.169343821 container attach 661ad48fe5e4bbe5a7d276beb772954aa48e3cf1ef648382ee6d279a48e3f509 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_moore, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=reef)
Dec 01 09:22:30 compute-0 elegant_moore[143100]: 167 167
Dec 01 09:22:30 compute-0 systemd[1]: libpod-661ad48fe5e4bbe5a7d276beb772954aa48e3cf1ef648382ee6d279a48e3f509.scope: Deactivated successfully.
Dec 01 09:22:30 compute-0 podman[143083]: 2025-12-01 09:22:30.787435207 +0000 UTC m=+0.173026295 container died 661ad48fe5e4bbe5a7d276beb772954aa48e3cf1ef648382ee6d279a48e3f509 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_moore, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:22:30 compute-0 systemd[1]: var-lib-containers-storage-overlay-4fbed17bb650c4d9d10a134138c56c7a4acadfc9db762c6829ca47d53daeec1f-merged.mount: Deactivated successfully.
Dec 01 09:22:30 compute-0 sudo[143073]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:30 compute-0 podman[143083]: 2025-12-01 09:22:30.82662144 +0000 UTC m=+0.212212518 container remove 661ad48fe5e4bbe5a7d276beb772954aa48e3cf1ef648382ee6d279a48e3f509 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_moore, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Dec 01 09:22:30 compute-0 systemd[1]: libpod-conmon-661ad48fe5e4bbe5a7d276beb772954aa48e3cf1ef648382ee6d279a48e3f509.scope: Deactivated successfully.
Dec 01 09:22:31 compute-0 podman[143151]: 2025-12-01 09:22:31.004329927 +0000 UTC m=+0.066422707 container create e0b41ed2bbe875c34a019df886b62fb73cf4b996c5ceff26aad50e0418f39afd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_dubinsky, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Dec 01 09:22:31 compute-0 podman[143151]: 2025-12-01 09:22:30.971528896 +0000 UTC m=+0.033621726 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:22:31 compute-0 systemd[1]: Started libpod-conmon-e0b41ed2bbe875c34a019df886b62fb73cf4b996c5ceff26aad50e0418f39afd.scope.
Dec 01 09:22:31 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:22:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/564f25c2f1fc0267c9782b035d09de309bccbae5aa761b887e549893ddca620a/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:22:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/564f25c2f1fc0267c9782b035d09de309bccbae5aa761b887e549893ddca620a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:22:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/564f25c2f1fc0267c9782b035d09de309bccbae5aa761b887e549893ddca620a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:22:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/564f25c2f1fc0267c9782b035d09de309bccbae5aa761b887e549893ddca620a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:22:31 compute-0 podman[143151]: 2025-12-01 09:22:31.121264939 +0000 UTC m=+0.183357739 container init e0b41ed2bbe875c34a019df886b62fb73cf4b996c5ceff26aad50e0418f39afd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_dubinsky, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:22:31 compute-0 podman[143151]: 2025-12-01 09:22:31.133386253 +0000 UTC m=+0.195479033 container start e0b41ed2bbe875c34a019df886b62fb73cf4b996c5ceff26aad50e0418f39afd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_dubinsky, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:22:31 compute-0 podman[143151]: 2025-12-01 09:22:31.138349234 +0000 UTC m=+0.200442014 container attach e0b41ed2bbe875c34a019df886b62fb73cf4b996c5ceff26aad50e0418f39afd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_dubinsky, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:22:31 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v317: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:31 compute-0 sudo[143298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkmornitlmawtlckaerzjcgwpfrnqmxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580950.9859407-248-149566981264632/AnsiballZ_blockinfile.py'
Dec 01 09:22:31 compute-0 sudo[143298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:31 compute-0 python3.9[143300]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:22:31 compute-0 sudo[143298]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]: {
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:     "0": [
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:         {
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             "devices": [
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "/dev/loop3"
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             ],
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             "lv_name": "ceph_lv0",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             "lv_size": "21470642176",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             "name": "ceph_lv0",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             "tags": {
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.cluster_name": "ceph",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.crush_device_class": "",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.encrypted": "0",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.osd_id": "0",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.type": "block",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.vdo": "0"
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             },
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             "type": "block",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             "vg_name": "ceph_vg0"
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:         }
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:     ],
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:     "1": [
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:         {
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             "devices": [
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "/dev/loop4"
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             ],
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             "lv_name": "ceph_lv1",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             "lv_size": "21470642176",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             "name": "ceph_lv1",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             "tags": {
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.cluster_name": "ceph",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.crush_device_class": "",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.encrypted": "0",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.osd_id": "1",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.type": "block",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.vdo": "0"
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             },
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             "type": "block",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             "vg_name": "ceph_vg1"
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:         }
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:     ],
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:     "2": [
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:         {
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             "devices": [
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "/dev/loop5"
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             ],
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             "lv_name": "ceph_lv2",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             "lv_size": "21470642176",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             "name": "ceph_lv2",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             "tags": {
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.cluster_name": "ceph",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.crush_device_class": "",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.encrypted": "0",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.osd_id": "2",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.type": "block",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:                 "ceph.vdo": "0"
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             },
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             "type": "block",
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:             "vg_name": "ceph_vg2"
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:         }
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]:     ]
Dec 01 09:22:31 compute-0 lucid_dubinsky[143217]: }
Dec 01 09:22:32 compute-0 systemd[1]: libpod-e0b41ed2bbe875c34a019df886b62fb73cf4b996c5ceff26aad50e0418f39afd.scope: Deactivated successfully.
Dec 01 09:22:32 compute-0 podman[143151]: 2025-12-01 09:22:32.017223004 +0000 UTC m=+1.079315784 container died e0b41ed2bbe875c34a019df886b62fb73cf4b996c5ceff26aad50e0418f39afd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_dubinsky, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Dec 01 09:22:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-564f25c2f1fc0267c9782b035d09de309bccbae5aa761b887e549893ddca620a-merged.mount: Deactivated successfully.
Dec 01 09:22:32 compute-0 podman[143151]: 2025-12-01 09:22:32.080563843 +0000 UTC m=+1.142656593 container remove e0b41ed2bbe875c34a019df886b62fb73cf4b996c5ceff26aad50e0418f39afd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_dubinsky, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2)
Dec 01 09:22:32 compute-0 systemd[1]: libpod-conmon-e0b41ed2bbe875c34a019df886b62fb73cf4b996c5ceff26aad50e0418f39afd.scope: Deactivated successfully.
Dec 01 09:22:32 compute-0 sudo[142895]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:32 compute-0 sudo[143418]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:22:32 compute-0 sudo[143418]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:22:32 compute-0 sudo[143418]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:32 compute-0 sudo[143516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnhbyzkawdscbfehsyhuaewmqqysiltq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580951.969146-257-78367203200687/AnsiballZ_command.py'
Dec 01 09:22:32 compute-0 sudo[143516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:32 compute-0 sudo[143471]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:22:32 compute-0 sudo[143471]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:22:32 compute-0 sudo[143471]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:32 compute-0 sudo[143521]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:22:32 compute-0 sudo[143521]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:22:32 compute-0 sudo[143521]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:32 compute-0 sudo[143546]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- raw list --format json
Dec 01 09:22:32 compute-0 sudo[143546]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:22:32 compute-0 python3.9[143519]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:22:32 compute-0 sudo[143516]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:32 compute-0 ceph-mon[75031]: pgmap v317: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:32 compute-0 podman[143689]: 2025-12-01 09:22:32.822546376 +0000 UTC m=+0.047363936 container create 352c5f8a7d95e6c7d78531f35c68fb45b9c7cd953c99ad28c0784a8f057a53e0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_benz, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507)
Dec 01 09:22:32 compute-0 systemd[1]: Started libpod-conmon-352c5f8a7d95e6c7d78531f35c68fb45b9c7cd953c99ad28c0784a8f057a53e0.scope.
Dec 01 09:22:32 compute-0 podman[143689]: 2025-12-01 09:22:32.799886823 +0000 UTC m=+0.024704423 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:22:32 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:22:32 compute-0 podman[143689]: 2025-12-01 09:22:32.919372816 +0000 UTC m=+0.144190396 container init 352c5f8a7d95e6c7d78531f35c68fb45b9c7cd953c99ad28c0784a8f057a53e0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_benz, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:22:32 compute-0 podman[143689]: 2025-12-01 09:22:32.933148697 +0000 UTC m=+0.157966257 container start 352c5f8a7d95e6c7d78531f35c68fb45b9c7cd953c99ad28c0784a8f057a53e0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_benz, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:22:32 compute-0 podman[143689]: 2025-12-01 09:22:32.938579812 +0000 UTC m=+0.163397382 container attach 352c5f8a7d95e6c7d78531f35c68fb45b9c7cd953c99ad28c0784a8f057a53e0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_benz, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:22:32 compute-0 sweet_benz[143732]: 167 167
Dec 01 09:22:32 compute-0 systemd[1]: libpod-352c5f8a7d95e6c7d78531f35c68fb45b9c7cd953c99ad28c0784a8f057a53e0.scope: Deactivated successfully.
Dec 01 09:22:32 compute-0 podman[143689]: 2025-12-01 09:22:32.940816105 +0000 UTC m=+0.165633685 container died 352c5f8a7d95e6c7d78531f35c68fb45b9c7cd953c99ad28c0784a8f057a53e0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_benz, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:22:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-772545479a02b333d6b8575f6da5bd9187e04118edffeba56fe572eac2b9a9da-merged.mount: Deactivated successfully.
Dec 01 09:22:32 compute-0 podman[143689]: 2025-12-01 09:22:32.983988471 +0000 UTC m=+0.208806021 container remove 352c5f8a7d95e6c7d78531f35c68fb45b9c7cd953c99ad28c0784a8f057a53e0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_benz, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:22:32 compute-0 systemd[1]: libpod-conmon-352c5f8a7d95e6c7d78531f35c68fb45b9c7cd953c99ad28c0784a8f057a53e0.scope: Deactivated successfully.
Dec 01 09:22:33 compute-0 sudo[143793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvheyflzwdmkvmtomjldorfhvoyfzccc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580952.691265-265-276562716356936/AnsiballZ_stat.py'
Dec 01 09:22:33 compute-0 sudo[143793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:33 compute-0 podman[143803]: 2025-12-01 09:22:33.168284546 +0000 UTC m=+0.048028656 container create dae1e5d15c62a6496f67a6a0ccf7d331e0d7710f663b1b715081086342a20eb4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wonderful_gates, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0)
Dec 01 09:22:33 compute-0 python3.9[143797]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:22:33 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v318: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:33 compute-0 sudo[143793]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:33 compute-0 systemd[1]: Started libpod-conmon-dae1e5d15c62a6496f67a6a0ccf7d331e0d7710f663b1b715081086342a20eb4.scope.
Dec 01 09:22:33 compute-0 podman[143803]: 2025-12-01 09:22:33.149768 +0000 UTC m=+0.029512130 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:22:33 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:22:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46b9484028f687832b784273f9e00efbe969116e8d436843163a6ea681366768/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:22:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46b9484028f687832b784273f9e00efbe969116e8d436843163a6ea681366768/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:22:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46b9484028f687832b784273f9e00efbe969116e8d436843163a6ea681366768/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:22:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46b9484028f687832b784273f9e00efbe969116e8d436843163a6ea681366768/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:22:33 compute-0 podman[143803]: 2025-12-01 09:22:33.310144485 +0000 UTC m=+0.189888655 container init dae1e5d15c62a6496f67a6a0ccf7d331e0d7710f663b1b715081086342a20eb4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wonderful_gates, org.label-schema.build-date=20250507, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:22:33 compute-0 podman[143803]: 2025-12-01 09:22:33.317179714 +0000 UTC m=+0.196923834 container start dae1e5d15c62a6496f67a6a0ccf7d331e0d7710f663b1b715081086342a20eb4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wonderful_gates, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3)
Dec 01 09:22:33 compute-0 podman[143803]: 2025-12-01 09:22:33.320454887 +0000 UTC m=+0.200199037 container attach dae1e5d15c62a6496f67a6a0ccf7d331e0d7710f663b1b715081086342a20eb4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wonderful_gates, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:22:33 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:22:33 compute-0 sudo[143975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgaopvkemddimscfocqnjladpoujyvwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580953.4072416-273-119285765322596/AnsiballZ_command.py'
Dec 01 09:22:33 compute-0 sudo[143975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:33 compute-0 python3.9[143977]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:22:33 compute-0 sudo[143975]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:34 compute-0 wonderful_gates[143821]: {
Dec 01 09:22:34 compute-0 wonderful_gates[143821]:     "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec 01 09:22:34 compute-0 wonderful_gates[143821]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:22:34 compute-0 wonderful_gates[143821]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 01 09:22:34 compute-0 wonderful_gates[143821]:         "osd_id": 0,
Dec 01 09:22:34 compute-0 wonderful_gates[143821]:         "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:22:34 compute-0 wonderful_gates[143821]:         "type": "bluestore"
Dec 01 09:22:34 compute-0 wonderful_gates[143821]:     },
Dec 01 09:22:34 compute-0 wonderful_gates[143821]:     "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec 01 09:22:34 compute-0 wonderful_gates[143821]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:22:34 compute-0 wonderful_gates[143821]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 01 09:22:34 compute-0 wonderful_gates[143821]:         "osd_id": 1,
Dec 01 09:22:34 compute-0 wonderful_gates[143821]:         "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:22:34 compute-0 wonderful_gates[143821]:         "type": "bluestore"
Dec 01 09:22:34 compute-0 wonderful_gates[143821]:     },
Dec 01 09:22:34 compute-0 wonderful_gates[143821]:     "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec 01 09:22:34 compute-0 wonderful_gates[143821]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:22:34 compute-0 wonderful_gates[143821]:         "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec 01 09:22:34 compute-0 wonderful_gates[143821]:         "osd_id": 2,
Dec 01 09:22:34 compute-0 wonderful_gates[143821]:         "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:22:34 compute-0 wonderful_gates[143821]:         "type": "bluestore"
Dec 01 09:22:34 compute-0 wonderful_gates[143821]:     }
Dec 01 09:22:34 compute-0 wonderful_gates[143821]: }
Dec 01 09:22:34 compute-0 systemd[1]: libpod-dae1e5d15c62a6496f67a6a0ccf7d331e0d7710f663b1b715081086342a20eb4.scope: Deactivated successfully.
Dec 01 09:22:34 compute-0 systemd[1]: libpod-dae1e5d15c62a6496f67a6a0ccf7d331e0d7710f663b1b715081086342a20eb4.scope: Consumed 1.019s CPU time.
Dec 01 09:22:34 compute-0 podman[143803]: 2025-12-01 09:22:34.334825327 +0000 UTC m=+1.214569517 container died dae1e5d15c62a6496f67a6a0ccf7d331e0d7710f663b1b715081086342a20eb4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wonderful_gates, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Dec 01 09:22:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-46b9484028f687832b784273f9e00efbe969116e8d436843163a6ea681366768-merged.mount: Deactivated successfully.
Dec 01 09:22:34 compute-0 podman[143803]: 2025-12-01 09:22:34.402277063 +0000 UTC m=+1.282021173 container remove dae1e5d15c62a6496f67a6a0ccf7d331e0d7710f663b1b715081086342a20eb4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wonderful_gates, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250507)
Dec 01 09:22:34 compute-0 systemd[1]: libpod-conmon-dae1e5d15c62a6496f67a6a0ccf7d331e0d7710f663b1b715081086342a20eb4.scope: Deactivated successfully.
Dec 01 09:22:34 compute-0 sudo[143546]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:34 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:22:34 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:22:34 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:22:34 compute-0 sudo[144172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yiexyqizmujdzpdrwzrminqiiutqwbxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580954.1600008-281-12201719715936/AnsiballZ_file.py'
Dec 01 09:22:34 compute-0 sudo[144172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:34 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:22:34 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev fa9d40c6-08b8-4b3e-933d-60633a99e5ab does not exist
Dec 01 09:22:34 compute-0 sudo[144175]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:22:34 compute-0 sudo[144175]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:22:34 compute-0 sudo[144175]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:34 compute-0 sudo[144200]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:22:34 compute-0 sudo[144200]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:22:34 compute-0 sudo[144200]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:34 compute-0 python3.9[144174]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:22:34 compute-0 ceph-mon[75031]: pgmap v318: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:34 compute-0 sudo[144172]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:34 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:22:34 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:22:35 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v319: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:35 compute-0 python3.9[144374]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:22:36 compute-0 ceph-mon[75031]: pgmap v319: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:36 compute-0 sudo[144525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yprxqmlmuoilcrkemxiqpmyfurznsnfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580956.4694152-321-49613461349957/AnsiballZ_command.py'
Dec 01 09:22:36 compute-0 sudo[144525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:37 compute-0 python3.9[144527]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:c6:22:5a:f7" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:22:37 compute-0 ovs-vsctl[144528]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:c6:22:5a:f7 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Dec 01 09:22:37 compute-0 sudo[144525]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:37 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v320: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:37 compute-0 sudo[144678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elxkyhkkaibubpdmhvkfsvisdjoaaqyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580957.3893168-330-212779435838149/AnsiballZ_command.py'
Dec 01 09:22:37 compute-0 sudo[144678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:37 compute-0 python3.9[144680]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ovs-vsctl show | grep -q "Manager"
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:22:37 compute-0 sudo[144678]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:38 compute-0 sudo[144833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-daurlfnvqkmuhtxngklmpcdsqirasknw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580958.0633602-338-231370316816843/AnsiballZ_command.py'
Dec 01 09:22:38 compute-0 sudo[144833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:38 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:22:38 compute-0 python3.9[144835]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:22:38 compute-0 ovs-vsctl[144836]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Dec 01 09:22:38 compute-0 sudo[144833]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:38 compute-0 ceph-mon[75031]: pgmap v320: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:39 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v321: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:39 compute-0 python3.9[144986]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:22:40 compute-0 sudo[145138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etnfaobsbfqxqyqvfniqpliwvbtjkgtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580959.7138705-355-38838127437620/AnsiballZ_file.py'
Dec 01 09:22:40 compute-0 sudo[145138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:40 compute-0 python3.9[145140]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:22:40 compute-0 sudo[145138]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:40 compute-0 ceph-mon[75031]: pgmap v321: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:40 compute-0 sudo[145290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymvpmeofnibuvmoevlhwgxaroogmkamb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580960.486728-363-48374479477804/AnsiballZ_stat.py'
Dec 01 09:22:40 compute-0 sudo[145290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:41 compute-0 python3.9[145292]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:22:41 compute-0 sudo[145290]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:41 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v322: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:41 compute-0 sudo[145368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uulkrzcejgusqzzuavmiiwwoncwvagtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580960.486728-363-48374479477804/AnsiballZ_file.py'
Dec 01 09:22:41 compute-0 sudo[145368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:41 compute-0 python3.9[145370]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:22:41 compute-0 sudo[145368]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:41 compute-0 sudo[145520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxunfdjxhvvzxgnrmbkhwxetqmxcxmzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580961.6099038-363-108011394925827/AnsiballZ_stat.py'
Dec 01 09:22:41 compute-0 sudo[145520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:42 compute-0 python3.9[145522]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:22:42 compute-0 sudo[145520]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:42 compute-0 sudo[145598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwjeujvifwckzzrjkxuiynyjvfwnfpna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580961.6099038-363-108011394925827/AnsiballZ_file.py'
Dec 01 09:22:42 compute-0 sudo[145598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:42 compute-0 python3.9[145600]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:22:42 compute-0 sudo[145598]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:42 compute-0 ceph-mon[75031]: pgmap v322: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:22:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:22:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:22:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:22:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:22:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:22:43 compute-0 sudo[145750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amoeprotwvclxrvahhlsfwldajsjiekd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580962.751717-386-195368310267033/AnsiballZ_file.py'
Dec 01 09:22:43 compute-0 sudo[145750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:43 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v323: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:43 compute-0 python3.9[145752]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:22:43 compute-0 sudo[145750]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:43 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:22:43 compute-0 sudo[145902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdewjsebkfjyaidssqmaykqvvlarlnrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580963.4898345-394-271191989306132/AnsiballZ_stat.py'
Dec 01 09:22:43 compute-0 sudo[145902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:44 compute-0 python3.9[145904]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:22:44 compute-0 sudo[145902]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:44 compute-0 sudo[145980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sifbbwqcnxokcqpmzyphotgkwrwsrgkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580963.4898345-394-271191989306132/AnsiballZ_file.py'
Dec 01 09:22:44 compute-0 sudo[145980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:44 compute-0 python3.9[145982]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:22:44 compute-0 sudo[145980]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:44 compute-0 ceph-mon[75031]: pgmap v323: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:45 compute-0 sudo[146132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csdqclsynrxxmmadydhbvbbfznznrvzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580964.7384653-406-193585483194631/AnsiballZ_stat.py'
Dec 01 09:22:45 compute-0 sudo[146132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:45 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v324: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:45 compute-0 python3.9[146134]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:22:45 compute-0 sudo[146132]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:45 compute-0 sudo[146210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdxmstnrvrdjafrkeacvxdzcywwksobu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580964.7384653-406-193585483194631/AnsiballZ_file.py'
Dec 01 09:22:45 compute-0 sudo[146210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:45 compute-0 python3.9[146212]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:22:45 compute-0 sudo[146210]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:46 compute-0 sudo[146362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfciwxmoonpbzusctcjvxastsarlexqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580966.0163352-418-25524809222629/AnsiballZ_systemd.py'
Dec 01 09:22:46 compute-0 sudo[146362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:46 compute-0 python3.9[146364]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:22:46 compute-0 systemd[1]: Reloading.
Dec 01 09:22:46 compute-0 systemd-rc-local-generator[146387]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:22:46 compute-0 systemd-sysv-generator[146391]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:22:46 compute-0 sudo[146362]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:47 compute-0 ceph-mon[75031]: pgmap v324: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:47 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v325: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:47 compute-0 sudo[146551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrrpnmzrkzlpcnfdwgfdiqdneeztnwwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580967.1604955-426-77563312988616/AnsiballZ_stat.py'
Dec 01 09:22:47 compute-0 sudo[146551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:47 compute-0 python3.9[146553]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:22:47 compute-0 sudo[146551]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:47 compute-0 sudo[146629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxcjmqscxzuvsqhhtwdmturrszkibyqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580967.1604955-426-77563312988616/AnsiballZ_file.py'
Dec 01 09:22:47 compute-0 sudo[146629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:48 compute-0 python3.9[146631]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:22:48 compute-0 sudo[146629]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:48 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:22:48 compute-0 sudo[146781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijxyopiaewalcxqyhibhakfmwzqxhebf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580968.408213-438-133283715864799/AnsiballZ_stat.py'
Dec 01 09:22:48 compute-0 sudo[146781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:48 compute-0 python3.9[146783]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:22:48 compute-0 sudo[146781]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:49 compute-0 ceph-mon[75031]: pgmap v325: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:49 compute-0 sudo[146859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsepawokieffbwbmcqjhwqvlrbjllarn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580968.408213-438-133283715864799/AnsiballZ_file.py'
Dec 01 09:22:49 compute-0 sudo[146859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:49 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v326: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:49 compute-0 python3.9[146861]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:22:49 compute-0 sudo[146859]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:49 compute-0 sudo[147011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lueyxvdtiwofwhhsiozqscolaijubrjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580969.554995-450-217354981031885/AnsiballZ_systemd.py'
Dec 01 09:22:49 compute-0 sudo[147011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:50 compute-0 python3.9[147013]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:22:50 compute-0 systemd[1]: Reloading.
Dec 01 09:22:50 compute-0 systemd-rc-local-generator[147032]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:22:50 compute-0 systemd-sysv-generator[147039]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:22:50 compute-0 systemd[1]: Starting Create netns directory...
Dec 01 09:22:50 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 01 09:22:50 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 01 09:22:50 compute-0 systemd[1]: Finished Create netns directory.
Dec 01 09:22:50 compute-0 sudo[147011]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:51 compute-0 ceph-mon[75031]: pgmap v326: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:51 compute-0 sudo[147204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqfvhsddsmqthkdpmupibhapskibcjpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580970.945697-460-242849494640676/AnsiballZ_file.py'
Dec 01 09:22:51 compute-0 sudo[147204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:51 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v327: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:51 compute-0 python3.9[147206]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:22:51 compute-0 sudo[147204]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:51 compute-0 sudo[147356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtbfinamgqjevqqgtergktawkfnuffhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580971.641556-468-223178151653433/AnsiballZ_stat.py'
Dec 01 09:22:51 compute-0 sudo[147356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:52 compute-0 python3.9[147358]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:22:52 compute-0 sudo[147356]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:52 compute-0 sudo[147479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nszsaecdibzrfjkyurwvjhasqfdydqsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580971.641556-468-223178151653433/AnsiballZ_copy.py'
Dec 01 09:22:52 compute-0 sudo[147479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:52 compute-0 python3.9[147481]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764580971.641556-468-223178151653433/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:22:52 compute-0 sudo[147479]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:53 compute-0 ceph-mon[75031]: pgmap v327: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:53 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v328: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:53 compute-0 sudo[147631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-piscbeqjirtzsuhccgwemnwwnaubytcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580973.081303-485-204618094369341/AnsiballZ_file.py'
Dec 01 09:22:53 compute-0 sudo[147631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:53 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:22:53 compute-0 python3.9[147633]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:22:53 compute-0 sudo[147631]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:54 compute-0 sudo[147783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kimikkjdnjzgfdpizqqfyfpvaaazgepk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580973.8409085-493-19042733144268/AnsiballZ_stat.py'
Dec 01 09:22:54 compute-0 sudo[147783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:54 compute-0 python3.9[147785]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:22:54 compute-0 sudo[147783]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:54 compute-0 sudo[147906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svyhpuclvcqnmdvwprkgtcsibvjrcaeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580973.8409085-493-19042733144268/AnsiballZ_copy.py'
Dec 01 09:22:54 compute-0 sudo[147906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:54 compute-0 python3.9[147908]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580973.8409085-493-19042733144268/.source.json _original_basename=.debfnmn8 follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:22:54 compute-0 sudo[147906]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:55 compute-0 ceph-mon[75031]: pgmap v328: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:55 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v329: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:55 compute-0 sudo[148058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbawkflclovgdcajyrmscyvrdzszvred ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580975.0230813-508-142313794697486/AnsiballZ_file.py'
Dec 01 09:22:55 compute-0 sudo[148058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:55 compute-0 python3.9[148060]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:22:55 compute-0 sudo[148058]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:56 compute-0 sudo[148210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvhtepzndgqevamriqswaohgwwkysvzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580975.7306273-516-154606369329593/AnsiballZ_stat.py'
Dec 01 09:22:56 compute-0 sudo[148210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:56 compute-0 sudo[148210]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:56 compute-0 sudo[148333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rseeyejlyqxuefsxebpyfoywqbzwcljs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580975.7306273-516-154606369329593/AnsiballZ_copy.py'
Dec 01 09:22:56 compute-0 sudo[148333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:56 compute-0 sudo[148333]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:57 compute-0 ceph-mon[75031]: pgmap v329: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:57 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v330: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:57 compute-0 sudo[148485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifijiiikvfgpowqttzzbwccjswrtjwut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580977.1131592-533-92392381265294/AnsiballZ_container_config_data.py'
Dec 01 09:22:57 compute-0 sudo[148485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:57 compute-0 python3.9[148487]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Dec 01 09:22:57 compute-0 sudo[148485]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:58 compute-0 sudo[148637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikhmtboperkarewbshaldxznshwfxxiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580977.9823594-542-273420281260579/AnsiballZ_container_config_hash.py'
Dec 01 09:22:58 compute-0 sudo[148637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:58 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:22:58 compute-0 python3.9[148639]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 01 09:22:58 compute-0 sudo[148637]: pam_unix(sudo:session): session closed for user root
Dec 01 09:22:59 compute-0 ceph-mon[75031]: pgmap v330: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:59 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v331: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:22:59 compute-0 sudo[148789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upvawntjangdryojjelckgzxfiphzdiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580978.8661768-551-154941866395450/AnsiballZ_podman_container_info.py'
Dec 01 09:22:59 compute-0 sudo[148789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:22:59 compute-0 python3.9[148791]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 01 09:22:59 compute-0 sudo[148789]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:00 compute-0 sudo[148968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkloikgwybmbrjiscnaqmizkalbvxrhn ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764580980.2802298-564-173289319962481/AnsiballZ_edpm_container_manage.py'
Dec 01 09:23:00 compute-0 sudo[148968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:01 compute-0 python3[148970]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 01 09:23:01 compute-0 ceph-mon[75031]: pgmap v331: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:01 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v332: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:03 compute-0 ceph-mon[75031]: pgmap v332: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:03 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v333: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:03 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:23:05 compute-0 ceph-mon[75031]: pgmap v333: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:05 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v334: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:07 compute-0 podman[148982]: 2025-12-01 09:23:07.003057856 +0000 UTC m=+5.876582464 image pull 3a37a52861b2e44ebd2a63ca2589a7c9d8e4119e5feace9d19c6312ed9b8421c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 01 09:23:07 compute-0 ceph-mon[75031]: pgmap v334: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:07 compute-0 podman[149099]: 2025-12-01 09:23:07.207889686 +0000 UTC m=+0.075439288 container create 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 01 09:23:07 compute-0 podman[149099]: 2025-12-01 09:23:07.174451661 +0000 UTC m=+0.042001273 image pull 3a37a52861b2e44ebd2a63ca2589a7c9d8e4119e5feace9d19c6312ed9b8421c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 01 09:23:07 compute-0 python3[148970]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 01 09:23:07 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v335: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:07 compute-0 sudo[148968]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:07 compute-0 sudo[149286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzfknrhlrpzvhwnwiozbayobnwdbubia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580987.5681338-572-202258190898830/AnsiballZ_stat.py'
Dec 01 09:23:07 compute-0 sudo[149286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:08 compute-0 python3.9[149288]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:23:08 compute-0 sudo[149286]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:08 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:23:08 compute-0 sudo[149440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmkwqiojdivxjlpigndqzsyijttptzoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580988.3467402-581-158639874279052/AnsiballZ_file.py'
Dec 01 09:23:08 compute-0 sudo[149440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:08 compute-0 python3.9[149442]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:23:08 compute-0 sudo[149440]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:09 compute-0 sudo[149516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkcgvfjqikhlyiihuyibqsaopaeasilz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580988.3467402-581-158639874279052/AnsiballZ_stat.py'
Dec 01 09:23:09 compute-0 sudo[149516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:09 compute-0 ceph-mon[75031]: pgmap v335: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:09 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v336: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:09 compute-0 python3.9[149518]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:23:09 compute-0 sudo[149516]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:09 compute-0 sudo[149667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxkxygwutyuqmomvfwooksytmiemsyve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580989.3717155-581-123960181093906/AnsiballZ_copy.py'
Dec 01 09:23:09 compute-0 sudo[149667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:10 compute-0 python3.9[149669]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764580989.3717155-581-123960181093906/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:23:10 compute-0 sudo[149667]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:10 compute-0 sudo[149743]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpnzurfrsivpzfuayjoflrtleudhkfzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580989.3717155-581-123960181093906/AnsiballZ_systemd.py'
Dec 01 09:23:10 compute-0 sudo[149743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:10 compute-0 ceph-mon[75031]: pgmap v336: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:10 compute-0 python3.9[149745]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 01 09:23:10 compute-0 systemd[1]: Reloading.
Dec 01 09:23:10 compute-0 systemd-sysv-generator[149774]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:23:10 compute-0 systemd-rc-local-generator[149771]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:23:11 compute-0 sudo[149743]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:11 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v337: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:11 compute-0 sudo[149855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utqhnrkfaxwuybiwlifsoqewdrddeabo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580989.3717155-581-123960181093906/AnsiballZ_systemd.py'
Dec 01 09:23:11 compute-0 sudo[149855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:11 compute-0 python3.9[149857]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:23:11 compute-0 systemd[1]: Reloading.
Dec 01 09:23:11 compute-0 systemd-rc-local-generator[149889]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:23:11 compute-0 systemd-sysv-generator[149893]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:23:12 compute-0 systemd[1]: Starting ovn_controller container...
Dec 01 09:23:12 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:23:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b77894e7671e5f816c1686647ee4a2e892983fd7b53971baf083a85c34a46778/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 01 09:23:12 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c.
Dec 01 09:23:12 compute-0 podman[149898]: 2025-12-01 09:23:12.208668003 +0000 UTC m=+0.131907578 container init 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 01 09:23:12 compute-0 ovn_controller[149914]: + sudo -E kolla_set_configs
Dec 01 09:23:12 compute-0 podman[149898]: 2025-12-01 09:23:12.232863692 +0000 UTC m=+0.156103247 container start 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 01 09:23:12 compute-0 edpm-start-podman-container[149898]: ovn_controller
Dec 01 09:23:12 compute-0 systemd[1]: Created slice User Slice of UID 0.
Dec 01 09:23:12 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Dec 01 09:23:12 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Dec 01 09:23:12 compute-0 edpm-start-podman-container[149897]: Creating additional drop-in dependency for "ovn_controller" (34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c)
Dec 01 09:23:12 compute-0 systemd[1]: Starting User Manager for UID 0...
Dec 01 09:23:12 compute-0 podman[149921]: 2025-12-01 09:23:12.312076528 +0000 UTC m=+0.066013856 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec 01 09:23:12 compute-0 systemd[149955]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Dec 01 09:23:12 compute-0 systemd[1]: 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c-968c33bf7dddd62.service: Main process exited, code=exited, status=1/FAILURE
Dec 01 09:23:12 compute-0 systemd[1]: 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c-968c33bf7dddd62.service: Failed with result 'exit-code'.
Dec 01 09:23:12 compute-0 systemd[1]: Reloading.
Dec 01 09:23:12 compute-0 systemd-sysv-generator[150005]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:23:12 compute-0 systemd-rc-local-generator[150002]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:23:12 compute-0 ceph-mon[75031]: pgmap v337: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:12 compute-0 systemd[149955]: Queued start job for default target Main User Target.
Dec 01 09:23:12 compute-0 systemd[149955]: Created slice User Application Slice.
Dec 01 09:23:12 compute-0 systemd[149955]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec 01 09:23:12 compute-0 systemd[149955]: Started Daily Cleanup of User's Temporary Directories.
Dec 01 09:23:12 compute-0 systemd[149955]: Reached target Paths.
Dec 01 09:23:12 compute-0 systemd[149955]: Reached target Timers.
Dec 01 09:23:12 compute-0 systemd[149955]: Starting D-Bus User Message Bus Socket...
Dec 01 09:23:12 compute-0 systemd[149955]: Starting Create User's Volatile Files and Directories...
Dec 01 09:23:12 compute-0 systemd[149955]: Listening on D-Bus User Message Bus Socket.
Dec 01 09:23:12 compute-0 systemd[149955]: Reached target Sockets.
Dec 01 09:23:12 compute-0 systemd[149955]: Finished Create User's Volatile Files and Directories.
Dec 01 09:23:12 compute-0 systemd[149955]: Reached target Basic System.
Dec 01 09:23:12 compute-0 systemd[149955]: Reached target Main User Target.
Dec 01 09:23:12 compute-0 systemd[149955]: Startup finished in 155ms.
Dec 01 09:23:12 compute-0 systemd[1]: Started User Manager for UID 0.
Dec 01 09:23:12 compute-0 systemd[1]: Started Session c1 of User root.
Dec 01 09:23:12 compute-0 systemd[1]: Started ovn_controller container.
Dec 01 09:23:12 compute-0 sudo[149855]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:12 compute-0 ovn_controller[149914]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 01 09:23:12 compute-0 ovn_controller[149914]: INFO:__main__:Validating config file
Dec 01 09:23:12 compute-0 ovn_controller[149914]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 01 09:23:12 compute-0 ovn_controller[149914]: INFO:__main__:Writing out command to execute
Dec 01 09:23:12 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Dec 01 09:23:12 compute-0 ovn_controller[149914]: ++ cat /run_command
Dec 01 09:23:12 compute-0 ovn_controller[149914]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec 01 09:23:12 compute-0 ovn_controller[149914]: + ARGS=
Dec 01 09:23:12 compute-0 ovn_controller[149914]: + sudo kolla_copy_cacerts
Dec 01 09:23:12 compute-0 systemd[1]: Started Session c2 of User root.
Dec 01 09:23:12 compute-0 ovn_controller[149914]: + [[ ! -n '' ]]
Dec 01 09:23:12 compute-0 ovn_controller[149914]: + . kolla_extend_start
Dec 01 09:23:12 compute-0 ovn_controller[149914]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec 01 09:23:12 compute-0 ovn_controller[149914]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Dec 01 09:23:12 compute-0 ovn_controller[149914]: + umask 0022
Dec 01 09:23:12 compute-0 ovn_controller[149914]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Dec 01 09:23:12 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Dec 01 09:23:12 compute-0 ovn_controller[149914]: 2025-12-01T09:23:12Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec 01 09:23:12 compute-0 ovn_controller[149914]: 2025-12-01T09:23:12Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec 01 09:23:12 compute-0 ovn_controller[149914]: 2025-12-01T09:23:12Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Dec 01 09:23:12 compute-0 ovn_controller[149914]: 2025-12-01T09:23:12Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Dec 01 09:23:12 compute-0 ovn_controller[149914]: 2025-12-01T09:23:12Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 01 09:23:12 compute-0 ovn_controller[149914]: 2025-12-01T09:23:12Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Dec 01 09:23:12 compute-0 NetworkManager[48954]: <info>  [1764580992.8059] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/16)
Dec 01 09:23:12 compute-0 NetworkManager[48954]: <info>  [1764580992.8066] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 01 09:23:12 compute-0 NetworkManager[48954]: <info>  [1764580992.8076] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Dec 01 09:23:12 compute-0 NetworkManager[48954]: <info>  [1764580992.8080] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/18)
Dec 01 09:23:12 compute-0 NetworkManager[48954]: <info>  [1764580992.8083] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec 01 09:23:12 compute-0 ovn_controller[149914]: 2025-12-01T09:23:12Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec 01 09:23:12 compute-0 kernel: br-int: entered promiscuous mode
Dec 01 09:23:12 compute-0 ovn_controller[149914]: 2025-12-01T09:23:12Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 01 09:23:12 compute-0 ovn_controller[149914]: 2025-12-01T09:23:12Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 01 09:23:12 compute-0 ovn_controller[149914]: 2025-12-01T09:23:12Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Dec 01 09:23:12 compute-0 ovn_controller[149914]: 2025-12-01T09:23:12Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Dec 01 09:23:12 compute-0 ovn_controller[149914]: 2025-12-01T09:23:12Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Dec 01 09:23:12 compute-0 ovn_controller[149914]: 2025-12-01T09:23:12Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec 01 09:23:12 compute-0 ovn_controller[149914]: 2025-12-01T09:23:12Z|00014|main|INFO|OVS feature set changed, force recompute.
Dec 01 09:23:12 compute-0 ovn_controller[149914]: 2025-12-01T09:23:12Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 01 09:23:12 compute-0 ovn_controller[149914]: 2025-12-01T09:23:12Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 01 09:23:12 compute-0 ovn_controller[149914]: 2025-12-01T09:23:12Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 01 09:23:12 compute-0 ovn_controller[149914]: 2025-12-01T09:23:12Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Dec 01 09:23:12 compute-0 ovn_controller[149914]: 2025-12-01T09:23:12Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Dec 01 09:23:12 compute-0 ovn_controller[149914]: 2025-12-01T09:23:12Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 01 09:23:12 compute-0 ovn_controller[149914]: 2025-12-01T09:23:12Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec 01 09:23:12 compute-0 ovn_controller[149914]: 2025-12-01T09:23:12Z|00022|main|INFO|OVS feature set changed, force recompute.
Dec 01 09:23:12 compute-0 ovn_controller[149914]: 2025-12-01T09:23:12Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Dec 01 09:23:12 compute-0 ovn_controller[149914]: 2025-12-01T09:23:12Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Dec 01 09:23:12 compute-0 ovn_controller[149914]: 2025-12-01T09:23:12Z|00001|statctrl(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 01 09:23:12 compute-0 ovn_controller[149914]: 2025-12-01T09:23:12Z|00002|rconn(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 01 09:23:12 compute-0 ovn_controller[149914]: 2025-12-01T09:23:12Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 01 09:23:12 compute-0 ovn_controller[149914]: 2025-12-01T09:23:12Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 01 09:23:12 compute-0 ovn_controller[149914]: 2025-12-01T09:23:12Z|00003|rconn(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 01 09:23:12 compute-0 ovn_controller[149914]: 2025-12-01T09:23:12Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 01 09:23:12 compute-0 NetworkManager[48954]: <info>  [1764580992.8281] manager: (ovn-7cec17-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Dec 01 09:23:12 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Dec 01 09:23:12 compute-0 NetworkManager[48954]: <info>  [1764580992.8487] device (genev_sys_6081): carrier: link connected
Dec 01 09:23:12 compute-0 NetworkManager[48954]: <info>  [1764580992.8490] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/20)
Dec 01 09:23:12 compute-0 systemd-udevd[150076]: Network interface NamePolicy= disabled on kernel command line.
Dec 01 09:23:12 compute-0 systemd-udevd[150072]: Network interface NamePolicy= disabled on kernel command line.
Dec 01 09:23:12 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:23:12
Dec 01 09:23:12 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:23:12 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:23:12 compute-0 ceph-mgr[75324]: [balancer INFO root] pools ['volumes', 'cephfs.cephfs.data', 'vms', 'images', 'backups', 'cephfs.cephfs.meta', '.mgr']
Dec 01 09:23:12 compute-0 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec 01 09:23:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:23:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:23:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:23:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:23:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:23:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:23:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:23:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:23:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:23:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:23:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:23:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:23:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:23:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:23:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:23:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:23:13 compute-0 sudo[150181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aiwbyupvzvdfmongoqdyslmedvgbrnrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580992.847519-609-201520634333866/AnsiballZ_command.py'
Dec 01 09:23:13 compute-0 sudo[150181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:13 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v338: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:13 compute-0 python3.9[150183]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:23:13 compute-0 ovs-vsctl[150184]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Dec 01 09:23:13 compute-0 sudo[150181]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:23:13 compute-0 sudo[150334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pokfolwfsdmhohnqgacafgznfwcuuflg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580993.4976509-617-214352886421869/AnsiballZ_command.py'
Dec 01 09:23:13 compute-0 sudo[150334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:13 compute-0 python3.9[150336]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:23:13 compute-0 ovs-vsctl[150338]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Dec 01 09:23:14 compute-0 sudo[150334]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:14 compute-0 ceph-mon[75031]: pgmap v338: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:14 compute-0 sudo[150489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrkxsrexlrvxerbrpmpsaxdhhpchhays ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764580994.3688433-631-116418917685072/AnsiballZ_command.py'
Dec 01 09:23:14 compute-0 sudo[150489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:14 compute-0 python3.9[150491]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:23:14 compute-0 ovs-vsctl[150492]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Dec 01 09:23:14 compute-0 sudo[150489]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:15 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v339: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:15 compute-0 sshd-session[138433]: Connection closed by 192.168.122.30 port 60188
Dec 01 09:23:15 compute-0 sshd-session[138430]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:23:15 compute-0 systemd[1]: session-46.scope: Deactivated successfully.
Dec 01 09:23:15 compute-0 systemd[1]: session-46.scope: Consumed 1min 3.147s CPU time.
Dec 01 09:23:15 compute-0 systemd-logind[788]: Session 46 logged out. Waiting for processes to exit.
Dec 01 09:23:15 compute-0 systemd-logind[788]: Removed session 46.
Dec 01 09:23:16 compute-0 ceph-mon[75031]: pgmap v339: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:17 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v340: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:23:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:23:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:23:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:23:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:23:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:23:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:23:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:23:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:23:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:23:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:23:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:23:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec 01 09:23:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:23:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:23:18 compute-0 ceph-mon[75031]: pgmap v340: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:18 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:23:19 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v341: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:20 compute-0 ceph-mon[75031]: pgmap v341: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:21 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v342: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:21 compute-0 sshd-session[150517]: Accepted publickey for zuul from 192.168.122.30 port 38336 ssh2: ECDSA SHA256:qeYLzcMt7u+aIXWvxTim9ZQrhR80x0VMZgLc05vjQmw
Dec 01 09:23:21 compute-0 systemd-logind[788]: New session 48 of user zuul.
Dec 01 09:23:21 compute-0 systemd[1]: Started Session 48 of User zuul.
Dec 01 09:23:21 compute-0 sshd-session[150517]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:23:22 compute-0 python3.9[150670]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:23:22 compute-0 ceph-mon[75031]: pgmap v342: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:22 compute-0 systemd[1]: Stopping User Manager for UID 0...
Dec 01 09:23:22 compute-0 systemd[149955]: Activating special unit Exit the Session...
Dec 01 09:23:22 compute-0 systemd[149955]: Stopped target Main User Target.
Dec 01 09:23:22 compute-0 systemd[149955]: Stopped target Basic System.
Dec 01 09:23:22 compute-0 systemd[149955]: Stopped target Paths.
Dec 01 09:23:22 compute-0 systemd[149955]: Stopped target Sockets.
Dec 01 09:23:22 compute-0 systemd[149955]: Stopped target Timers.
Dec 01 09:23:22 compute-0 systemd[149955]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 01 09:23:22 compute-0 systemd[149955]: Closed D-Bus User Message Bus Socket.
Dec 01 09:23:22 compute-0 systemd[149955]: Stopped Create User's Volatile Files and Directories.
Dec 01 09:23:22 compute-0 systemd[149955]: Removed slice User Application Slice.
Dec 01 09:23:22 compute-0 systemd[149955]: Reached target Shutdown.
Dec 01 09:23:22 compute-0 systemd[149955]: Finished Exit the Session.
Dec 01 09:23:22 compute-0 systemd[149955]: Reached target Exit the Session.
Dec 01 09:23:22 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Dec 01 09:23:22 compute-0 systemd[1]: Stopped User Manager for UID 0.
Dec 01 09:23:22 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec 01 09:23:22 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Dec 01 09:23:22 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec 01 09:23:22 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec 01 09:23:22 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Dec 01 09:23:23 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v343: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:23 compute-0 sudo[150827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qupmxsdubefdezsqjyrifvmkkrdmlirp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581002.905897-34-59734882296207/AnsiballZ_file.py'
Dec 01 09:23:23 compute-0 sudo[150827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:23 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:23:23 compute-0 python3.9[150829]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:23:23 compute-0 sudo[150827]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:24 compute-0 sudo[150979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxbptkfwwlqkejfjugacynpcyzeulucg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581003.7217162-34-227720871110859/AnsiballZ_file.py'
Dec 01 09:23:24 compute-0 sudo[150979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:24 compute-0 python3.9[150981]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:23:24 compute-0 sudo[150979]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:24 compute-0 ceph-mon[75031]: pgmap v343: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:24 compute-0 sudo[151131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcjfbdpdbcmlngpznxqpvknctswqzwns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581004.4030032-34-112791987221039/AnsiballZ_file.py'
Dec 01 09:23:24 compute-0 sudo[151131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:24 compute-0 python3.9[151133]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:23:24 compute-0 sudo[151131]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:25 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v344: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:25 compute-0 sudo[151283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptflcfqicmgqzifucrfuhukdwwinfwgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581005.069993-34-136827940718269/AnsiballZ_file.py'
Dec 01 09:23:25 compute-0 sudo[151283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:25 compute-0 python3.9[151285]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:23:25 compute-0 sudo[151283]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:26 compute-0 sudo[151435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dftqsmevvndtdnbjlzebrzzpedkicmqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581005.6706715-34-118292862995625/AnsiballZ_file.py'
Dec 01 09:23:26 compute-0 sudo[151435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:26 compute-0 python3.9[151437]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:23:26 compute-0 sudo[151435]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:26 compute-0 ceph-mon[75031]: pgmap v344: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:26 compute-0 python3.9[151587]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:23:27 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v345: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:27 compute-0 sudo[151737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijofupqnssmfoqspyizunrkdrwkstntj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581007.1481047-78-133861889478713/AnsiballZ_seboolean.py'
Dec 01 09:23:27 compute-0 sudo[151737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:27 compute-0 python3.9[151739]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 01 09:23:28 compute-0 sudo[151737]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:28 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:23:28 compute-0 ceph-mon[75031]: pgmap v345: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:29 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v346: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:29 compute-0 python3.9[151889]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:23:30 compute-0 python3.9[152010]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764581008.7121596-86-104930840889940/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:23:30 compute-0 python3.9[152160]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:23:30 compute-0 ceph-mon[75031]: pgmap v346: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:31 compute-0 python3.9[152283]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764581010.2768936-101-206062927876463/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:23:31 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v347: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:31 compute-0 sudo[152433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctqabhrylohsbogejlnxlgwkufmsvwqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581011.5633616-118-191500756891737/AnsiballZ_setup.py'
Dec 01 09:23:31 compute-0 sudo[152433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:32 compute-0 python3.9[152435]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 09:23:32 compute-0 sudo[152433]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:32 compute-0 ceph-mon[75031]: pgmap v347: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:32 compute-0 sudo[152517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpqcugtbtsvppeyhqrgrvwydeshbprtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581011.5633616-118-191500756891737/AnsiballZ_dnf.py'
Dec 01 09:23:32 compute-0 sudo[152517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:33 compute-0 python3.9[152519]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:23:33 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v348: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:33 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:23:34 compute-0 sudo[152521]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:23:34 compute-0 sudo[152521]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:23:34 compute-0 sudo[152521]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:34 compute-0 sudo[152546]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:23:34 compute-0 sudo[152546]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:23:34 compute-0 sudo[152546]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:34 compute-0 sudo[152517]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:34 compute-0 sudo[152571]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:23:34 compute-0 sudo[152571]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:23:34 compute-0 sudo[152571]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:34 compute-0 sudo[152601]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 check-host
Dec 01 09:23:34 compute-0 sudo[152601]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:23:34 compute-0 ceph-mon[75031]: pgmap v348: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:35 compute-0 sudo[152601]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:35 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:23:35 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:23:35 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:23:35 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:23:35 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v349: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:35 compute-0 sudo[152719]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:23:35 compute-0 sudo[152719]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:23:35 compute-0 sudo[152719]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:35 compute-0 sudo[152744]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:23:35 compute-0 sudo[152744]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:23:35 compute-0 sudo[152744]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:35 compute-0 sudo[152792]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:23:35 compute-0 sudo[152792]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:23:35 compute-0 sudo[152792]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:35 compute-0 sudo[152822]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Dec 01 09:23:35 compute-0 sudo[152822]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:23:35 compute-0 sudo[152892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbvmhubsioqjqgqclhcufnnrjhsdosmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581014.9270875-130-129558888229991/AnsiballZ_systemd.py'
Dec 01 09:23:35 compute-0 sudo[152892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:35 compute-0 python3.9[152894]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 01 09:23:35 compute-0 sudo[152892]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:35 compute-0 sudo[152822]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:36 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:23:36 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:23:36 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec 01 09:23:36 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:23:36 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec 01 09:23:36 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:23:36 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 40d06d98-796c-49fc-9fd2-27cd21485fc6 does not exist
Dec 01 09:23:36 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev f75698da-039f-4aab-baf1-77d84153a9e3 does not exist
Dec 01 09:23:36 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev beaa4b04-d67c-4497-addc-7d613d264aaa does not exist
Dec 01 09:23:36 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec 01 09:23:36 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:23:36 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec 01 09:23:36 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:23:36 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:23:36 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:23:36 compute-0 sudo[152952]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:23:36 compute-0 sudo[152952]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:23:36 compute-0 sudo[152952]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:36 compute-0 sudo[152995]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:23:36 compute-0 sudo[152995]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:23:36 compute-0 sudo[152995]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:36 compute-0 sudo[153044]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:23:36 compute-0 sudo[153044]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:23:36 compute-0 sudo[153044]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:36 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:23:36 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:23:36 compute-0 ceph-mon[75031]: pgmap v349: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:36 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:23:36 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:23:36 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:23:36 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:23:36 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:23:36 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:23:36 compute-0 sudo[153080]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --yes --no-systemd
Dec 01 09:23:36 compute-0 sudo[153080]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:23:36 compute-0 python3.9[153187]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:23:36 compute-0 podman[153214]: 2025-12-01 09:23:36.571694899 +0000 UTC m=+0.049794978 container create 64a2d8408fa67e7433f681b8053724cff6e6ff7f1b8eecf4b4d411f64115ae52 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_dewdney, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2)
Dec 01 09:23:36 compute-0 systemd[1]: Started libpod-conmon-64a2d8408fa67e7433f681b8053724cff6e6ff7f1b8eecf4b4d411f64115ae52.scope.
Dec 01 09:23:36 compute-0 podman[153214]: 2025-12-01 09:23:36.543592898 +0000 UTC m=+0.021693007 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:23:36 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:23:36 compute-0 podman[153214]: 2025-12-01 09:23:36.662046577 +0000 UTC m=+0.140146686 container init 64a2d8408fa67e7433f681b8053724cff6e6ff7f1b8eecf4b4d411f64115ae52 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_dewdney, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:23:36 compute-0 podman[153214]: 2025-12-01 09:23:36.670523932 +0000 UTC m=+0.148624021 container start 64a2d8408fa67e7433f681b8053724cff6e6ff7f1b8eecf4b4d411f64115ae52 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_dewdney, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:23:36 compute-0 podman[153214]: 2025-12-01 09:23:36.673967131 +0000 UTC m=+0.152067240 container attach 64a2d8408fa67e7433f681b8053724cff6e6ff7f1b8eecf4b4d411f64115ae52 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_dewdney, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0)
Dec 01 09:23:36 compute-0 hardcore_dewdney[153249]: 167 167
Dec 01 09:23:36 compute-0 systemd[1]: libpod-64a2d8408fa67e7433f681b8053724cff6e6ff7f1b8eecf4b4d411f64115ae52.scope: Deactivated successfully.
Dec 01 09:23:36 compute-0 podman[153214]: 2025-12-01 09:23:36.676492514 +0000 UTC m=+0.154592623 container died 64a2d8408fa67e7433f681b8053724cff6e6ff7f1b8eecf4b4d411f64115ae52 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_dewdney, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507)
Dec 01 09:23:36 compute-0 systemd[1]: var-lib-containers-storage-overlay-5aa925afd2dc71c7e1e1d3345230026e390cc158cc2b95cab3f8701cefb78b08-merged.mount: Deactivated successfully.
Dec 01 09:23:36 compute-0 podman[153214]: 2025-12-01 09:23:36.717141457 +0000 UTC m=+0.195241576 container remove 64a2d8408fa67e7433f681b8053724cff6e6ff7f1b8eecf4b4d411f64115ae52 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_dewdney, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Dec 01 09:23:36 compute-0 systemd[1]: libpod-conmon-64a2d8408fa67e7433f681b8053724cff6e6ff7f1b8eecf4b4d411f64115ae52.scope: Deactivated successfully.
Dec 01 09:23:36 compute-0 podman[153359]: 2025-12-01 09:23:36.952784068 +0000 UTC m=+0.068242191 container create bdb03a90fdf615ca33397291f3b67f2b5b2020c1659d5803f80f98a8167edee8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_khayyam, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Dec 01 09:23:37 compute-0 systemd[1]: Started libpod-conmon-bdb03a90fdf615ca33397291f3b67f2b5b2020c1659d5803f80f98a8167edee8.scope.
Dec 01 09:23:37 compute-0 podman[153359]: 2025-12-01 09:23:36.9293072 +0000 UTC m=+0.044765353 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:23:37 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:23:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56716004b9a1a38bc7c592e47420fdb241617eb95197c22e990356941b598fd1/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:23:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56716004b9a1a38bc7c592e47420fdb241617eb95197c22e990356941b598fd1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:23:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56716004b9a1a38bc7c592e47420fdb241617eb95197c22e990356941b598fd1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:23:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56716004b9a1a38bc7c592e47420fdb241617eb95197c22e990356941b598fd1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:23:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56716004b9a1a38bc7c592e47420fdb241617eb95197c22e990356941b598fd1/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:23:37 compute-0 podman[153359]: 2025-12-01 09:23:37.078390303 +0000 UTC m=+0.193848426 container init bdb03a90fdf615ca33397291f3b67f2b5b2020c1659d5803f80f98a8167edee8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_khayyam, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS)
Dec 01 09:23:37 compute-0 podman[153359]: 2025-12-01 09:23:37.091193253 +0000 UTC m=+0.206651366 container start bdb03a90fdf615ca33397291f3b67f2b5b2020c1659d5803f80f98a8167edee8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_khayyam, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Dec 01 09:23:37 compute-0 python3.9[153380]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764581016.0948708-138-134806601772212/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:23:37 compute-0 podman[153359]: 2025-12-01 09:23:37.105516666 +0000 UTC m=+0.220974799 container attach bdb03a90fdf615ca33397291f3b67f2b5b2020c1659d5803f80f98a8167edee8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_khayyam, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.license=GPLv2)
Dec 01 09:23:37 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v350: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:37 compute-0 python3.9[153545]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:23:38 compute-0 modest_khayyam[153391]: --> passed data devices: 0 physical, 3 LVM
Dec 01 09:23:38 compute-0 modest_khayyam[153391]: --> relative data size: 1.0
Dec 01 09:23:38 compute-0 modest_khayyam[153391]: --> All data devices are unavailable
Dec 01 09:23:38 compute-0 systemd[1]: libpod-bdb03a90fdf615ca33397291f3b67f2b5b2020c1659d5803f80f98a8167edee8.scope: Deactivated successfully.
Dec 01 09:23:38 compute-0 systemd[1]: libpod-bdb03a90fdf615ca33397291f3b67f2b5b2020c1659d5803f80f98a8167edee8.scope: Consumed 1.039s CPU time.
Dec 01 09:23:38 compute-0 podman[153359]: 2025-12-01 09:23:38.211879587 +0000 UTC m=+1.327337740 container died bdb03a90fdf615ca33397291f3b67f2b5b2020c1659d5803f80f98a8167edee8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_khayyam, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:23:38 compute-0 systemd[1]: var-lib-containers-storage-overlay-56716004b9a1a38bc7c592e47420fdb241617eb95197c22e990356941b598fd1-merged.mount: Deactivated successfully.
Dec 01 09:23:38 compute-0 python3.9[153686]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764581017.2597456-138-256856895410513/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:23:38 compute-0 podman[153359]: 2025-12-01 09:23:38.278117228 +0000 UTC m=+1.393575341 container remove bdb03a90fdf615ca33397291f3b67f2b5b2020c1659d5803f80f98a8167edee8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_khayyam, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec 01 09:23:38 compute-0 systemd[1]: libpod-conmon-bdb03a90fdf615ca33397291f3b67f2b5b2020c1659d5803f80f98a8167edee8.scope: Deactivated successfully.
Dec 01 09:23:38 compute-0 sudo[153080]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:38 compute-0 ceph-mon[75031]: pgmap v350: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:38 compute-0 sudo[153710]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:23:38 compute-0 sudo[153710]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:23:38 compute-0 sudo[153710]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:38 compute-0 sudo[153751]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:23:38 compute-0 sudo[153751]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:23:38 compute-0 sudo[153751]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:38 compute-0 sudo[153776]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:23:38 compute-0 sudo[153776]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:23:38 compute-0 sudo[153776]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:38 compute-0 sudo[153801]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- lvm list --format json
Dec 01 09:23:38 compute-0 sudo[153801]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:23:38 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:23:38 compute-0 podman[153868]: 2025-12-01 09:23:38.879217547 +0000 UTC m=+0.061142376 container create 1b5729f4dae414ac6fe3e546464332893ceb9c2c6e793545cb4d4265bd4f62b3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_hellman, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:23:38 compute-0 podman[153868]: 2025-12-01 09:23:38.839929293 +0000 UTC m=+0.021854152 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:23:38 compute-0 systemd[1]: Started libpod-conmon-1b5729f4dae414ac6fe3e546464332893ceb9c2c6e793545cb4d4265bd4f62b3.scope.
Dec 01 09:23:38 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:23:39 compute-0 podman[153868]: 2025-12-01 09:23:39.066558834 +0000 UTC m=+0.248483683 container init 1b5729f4dae414ac6fe3e546464332893ceb9c2c6e793545cb4d4265bd4f62b3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_hellman, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:23:39 compute-0 podman[153868]: 2025-12-01 09:23:39.073195155 +0000 UTC m=+0.255119984 container start 1b5729f4dae414ac6fe3e546464332893ceb9c2c6e793545cb4d4265bd4f62b3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_hellman, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3)
Dec 01 09:23:39 compute-0 crazy_hellman[153884]: 167 167
Dec 01 09:23:39 compute-0 systemd[1]: libpod-1b5729f4dae414ac6fe3e546464332893ceb9c2c6e793545cb4d4265bd4f62b3.scope: Deactivated successfully.
Dec 01 09:23:39 compute-0 podman[153868]: 2025-12-01 09:23:39.093069679 +0000 UTC m=+0.274994538 container attach 1b5729f4dae414ac6fe3e546464332893ceb9c2c6e793545cb4d4265bd4f62b3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_hellman, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Dec 01 09:23:39 compute-0 podman[153868]: 2025-12-01 09:23:39.093655816 +0000 UTC m=+0.275580645 container died 1b5729f4dae414ac6fe3e546464332893ceb9c2c6e793545cb4d4265bd4f62b3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_hellman, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True)
Dec 01 09:23:39 compute-0 systemd[1]: var-lib-containers-storage-overlay-71dedbc496ca1be46ebe106c3d7fc491d3c73a349847d341768f93fcd372dbd2-merged.mount: Deactivated successfully.
Dec 01 09:23:39 compute-0 podman[153868]: 2025-12-01 09:23:39.244370815 +0000 UTC m=+0.426295644 container remove 1b5729f4dae414ac6fe3e546464332893ceb9c2c6e793545cb4d4265bd4f62b3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_hellman, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:23:39 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v351: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:39 compute-0 systemd[1]: libpod-conmon-1b5729f4dae414ac6fe3e546464332893ceb9c2c6e793545cb4d4265bd4f62b3.scope: Deactivated successfully.
Dec 01 09:23:39 compute-0 python3.9[154027]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:23:39 compute-0 podman[154033]: 2025-12-01 09:23:39.453717437 +0000 UTC m=+0.095380544 container create 404665cfbbf2f27da9c0b08ca6be82d54f56756202371d8f9fdfeac6640f9cae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_poincare, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:23:39 compute-0 podman[154033]: 2025-12-01 09:23:39.380905975 +0000 UTC m=+0.022569102 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:23:39 compute-0 systemd[1]: Started libpod-conmon-404665cfbbf2f27da9c0b08ca6be82d54f56756202371d8f9fdfeac6640f9cae.scope.
Dec 01 09:23:39 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:23:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3fcba4984687e969443dd1741f4d0391e1c37e3f3546a4dcfd07596d40f5984/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:23:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3fcba4984687e969443dd1741f4d0391e1c37e3f3546a4dcfd07596d40f5984/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:23:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3fcba4984687e969443dd1741f4d0391e1c37e3f3546a4dcfd07596d40f5984/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:23:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3fcba4984687e969443dd1741f4d0391e1c37e3f3546a4dcfd07596d40f5984/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:23:39 compute-0 podman[154033]: 2025-12-01 09:23:39.825944969 +0000 UTC m=+0.467608086 container init 404665cfbbf2f27da9c0b08ca6be82d54f56756202371d8f9fdfeac6640f9cae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_poincare, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:23:39 compute-0 podman[154033]: 2025-12-01 09:23:39.833397125 +0000 UTC m=+0.475060232 container start 404665cfbbf2f27da9c0b08ca6be82d54f56756202371d8f9fdfeac6640f9cae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_poincare, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:23:39 compute-0 podman[154033]: 2025-12-01 09:23:39.837669788 +0000 UTC m=+0.479332895 container attach 404665cfbbf2f27da9c0b08ca6be82d54f56756202371d8f9fdfeac6640f9cae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_poincare, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, ceph=True)
Dec 01 09:23:39 compute-0 python3.9[154173]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764581019.0056908-182-248107640304222/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]: {
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:     "0": [
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:         {
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             "devices": [
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "/dev/loop3"
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             ],
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             "lv_name": "ceph_lv0",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             "lv_size": "21470642176",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             "name": "ceph_lv0",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             "tags": {
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.cluster_name": "ceph",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.crush_device_class": "",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.encrypted": "0",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.osd_id": "0",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.type": "block",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.vdo": "0"
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             },
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             "type": "block",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             "vg_name": "ceph_vg0"
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:         }
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:     ],
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:     "1": [
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:         {
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             "devices": [
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "/dev/loop4"
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             ],
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             "lv_name": "ceph_lv1",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             "lv_size": "21470642176",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             "name": "ceph_lv1",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             "tags": {
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.cluster_name": "ceph",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.crush_device_class": "",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.encrypted": "0",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.osd_id": "1",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.type": "block",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.vdo": "0"
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             },
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             "type": "block",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             "vg_name": "ceph_vg1"
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:         }
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:     ],
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:     "2": [
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:         {
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             "devices": [
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "/dev/loop5"
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             ],
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             "lv_name": "ceph_lv2",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             "lv_size": "21470642176",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             "name": "ceph_lv2",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             "tags": {
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.cluster_name": "ceph",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.crush_device_class": "",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.encrypted": "0",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.osd_id": "2",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.type": "block",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:                 "ceph.vdo": "0"
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             },
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             "type": "block",
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:             "vg_name": "ceph_vg2"
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:         }
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]:     ]
Dec 01 09:23:40 compute-0 vigorous_poincare[154120]: }
Dec 01 09:23:40 compute-0 systemd[1]: libpod-404665cfbbf2f27da9c0b08ca6be82d54f56756202371d8f9fdfeac6640f9cae.scope: Deactivated successfully.
Dec 01 09:23:40 compute-0 podman[154033]: 2025-12-01 09:23:40.603693549 +0000 UTC m=+1.245356686 container died 404665cfbbf2f27da9c0b08ca6be82d54f56756202371d8f9fdfeac6640f9cae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_poincare, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, io.buildah.version=1.39.3)
Dec 01 09:23:40 compute-0 ceph-mon[75031]: pgmap v351: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:41 compute-0 systemd[1]: var-lib-containers-storage-overlay-a3fcba4984687e969443dd1741f4d0391e1c37e3f3546a4dcfd07596d40f5984-merged.mount: Deactivated successfully.
Dec 01 09:23:41 compute-0 podman[154033]: 2025-12-01 09:23:41.07344179 +0000 UTC m=+1.715104887 container remove 404665cfbbf2f27da9c0b08ca6be82d54f56756202371d8f9fdfeac6640f9cae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_poincare, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:23:41 compute-0 python3.9[154325]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:23:41 compute-0 sudo[153801]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:41 compute-0 sudo[154344]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:23:41 compute-0 sudo[154344]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:23:41 compute-0 sudo[154344]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:41 compute-0 systemd[1]: libpod-conmon-404665cfbbf2f27da9c0b08ca6be82d54f56756202371d8f9fdfeac6640f9cae.scope: Deactivated successfully.
Dec 01 09:23:41 compute-0 sudo[154392]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:23:41 compute-0 sudo[154392]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:23:41 compute-0 sudo[154392]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:41 compute-0 sudo[154440]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:23:41 compute-0 sudo[154440]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:23:41 compute-0 sudo[154440]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:41 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v352: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:41 compute-0 sudo[154487]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- raw list --format json
Dec 01 09:23:41 compute-0 sudo[154487]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:23:41 compute-0 python3.9[154564]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764581020.1299899-182-187927865733140/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:23:41 compute-0 podman[154605]: 2025-12-01 09:23:41.629402425 +0000 UTC m=+0.044355734 container create 9be607d2bc2f5c8e97ce7c9739b7758ef08a5cb86b2bd4663b524dcd0fe8bd90 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_heyrovsky, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=reef)
Dec 01 09:23:41 compute-0 systemd[1]: Started libpod-conmon-9be607d2bc2f5c8e97ce7c9739b7758ef08a5cb86b2bd4663b524dcd0fe8bd90.scope.
Dec 01 09:23:41 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:23:41 compute-0 podman[154605]: 2025-12-01 09:23:41.605890151 +0000 UTC m=+0.020843480 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:23:41 compute-0 podman[154605]: 2025-12-01 09:23:41.71595213 +0000 UTC m=+0.130905449 container init 9be607d2bc2f5c8e97ce7c9739b7758ef08a5cb86b2bd4663b524dcd0fe8bd90 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_heyrovsky, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:23:41 compute-0 podman[154605]: 2025-12-01 09:23:41.725097709 +0000 UTC m=+0.140051018 container start 9be607d2bc2f5c8e97ce7c9739b7758ef08a5cb86b2bd4663b524dcd0fe8bd90 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_heyrovsky, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:23:41 compute-0 podman[154605]: 2025-12-01 09:23:41.729245156 +0000 UTC m=+0.144198555 container attach 9be607d2bc2f5c8e97ce7c9739b7758ef08a5cb86b2bd4663b524dcd0fe8bd90 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_heyrovsky, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507)
Dec 01 09:23:41 compute-0 admiring_heyrovsky[154644]: 167 167
Dec 01 09:23:41 compute-0 systemd[1]: libpod-9be607d2bc2f5c8e97ce7c9739b7758ef08a5cb86b2bd4663b524dcd0fe8bd90.scope: Deactivated successfully.
Dec 01 09:23:41 compute-0 podman[154605]: 2025-12-01 09:23:41.736672516 +0000 UTC m=+0.151625845 container died 9be607d2bc2f5c8e97ce7c9739b7758ef08a5cb86b2bd4663b524dcd0fe8bd90 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_heyrovsky, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Dec 01 09:23:41 compute-0 systemd[1]: var-lib-containers-storage-overlay-c08938f199fd711aad54d76a27632c7339101eeb290083492db8abadd11597a5-merged.mount: Deactivated successfully.
Dec 01 09:23:41 compute-0 podman[154605]: 2025-12-01 09:23:41.773940498 +0000 UTC m=+0.188893807 container remove 9be607d2bc2f5c8e97ce7c9739b7758ef08a5cb86b2bd4663b524dcd0fe8bd90 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_heyrovsky, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:23:41 compute-0 systemd[1]: libpod-conmon-9be607d2bc2f5c8e97ce7c9739b7758ef08a5cb86b2bd4663b524dcd0fe8bd90.scope: Deactivated successfully.
Dec 01 09:23:41 compute-0 podman[154721]: 2025-12-01 09:23:41.944675792 +0000 UTC m=+0.050001404 container create 887826374d74feaa8a839731f4aefe83ee03e9d69550fa1a1c3f4dde835bbaa5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_liskov, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:23:42 compute-0 podman[154721]: 2025-12-01 09:23:41.922204767 +0000 UTC m=+0.027530399 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:23:42 compute-0 systemd[1]: Started libpod-conmon-887826374d74feaa8a839731f4aefe83ee03e9d69550fa1a1c3f4dde835bbaa5.scope.
Dec 01 09:23:42 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:23:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e272aaada3acd3a7ee19a0c40cdf63e9b3d03732d7313aeb9d26a4a90ac7a0ce/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:23:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e272aaada3acd3a7ee19a0c40cdf63e9b3d03732d7313aeb9d26a4a90ac7a0ce/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:23:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e272aaada3acd3a7ee19a0c40cdf63e9b3d03732d7313aeb9d26a4a90ac7a0ce/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:23:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e272aaada3acd3a7ee19a0c40cdf63e9b3d03732d7313aeb9d26a4a90ac7a0ce/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:23:42 compute-0 podman[154721]: 2025-12-01 09:23:42.216310185 +0000 UTC m=+0.321635827 container init 887826374d74feaa8a839731f4aefe83ee03e9d69550fa1a1c3f4dde835bbaa5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_liskov, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=reef)
Dec 01 09:23:42 compute-0 podman[154721]: 2025-12-01 09:23:42.228140509 +0000 UTC m=+0.333466121 container start 887826374d74feaa8a839731f4aefe83ee03e9d69550fa1a1c3f4dde835bbaa5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_liskov, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.39.3)
Dec 01 09:23:42 compute-0 podman[154721]: 2025-12-01 09:23:42.232217314 +0000 UTC m=+0.337542926 container attach 887826374d74feaa8a839731f4aefe83ee03e9d69550fa1a1c3f4dde835bbaa5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_liskov, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Dec 01 09:23:42 compute-0 python3.9[154807]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:23:42 compute-0 ceph-mon[75031]: pgmap v352: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:42 compute-0 sudo[154978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytbjhczfimvhbadgyetayfdhggehzxkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581022.5731735-220-98010084780685/AnsiballZ_file.py'
Dec 01 09:23:42 compute-0 sudo[154978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:42 compute-0 ovn_controller[149914]: 2025-12-01T09:23:42Z|00025|memory|INFO|17664 kB peak resident set size after 30.1 seconds
Dec 01 09:23:42 compute-0 ovn_controller[149914]: 2025-12-01T09:23:42Z|00026|memory|INFO|idl-cells-OVN_Southbound:239 idl-cells-Open_vSwitch:528 ofctrl_desired_flow_usage-KB:5 ofctrl_installed_flow_usage-KB:4 ofctrl_sb_flow_ref_usage-KB:2
Dec 01 09:23:42 compute-0 podman[154940]: 2025-12-01 09:23:42.915008302 +0000 UTC m=+0.094915192 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 01 09:23:43 compute-0 python3.9[154989]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:23:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:23:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:23:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:23:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:23:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:23:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:23:43 compute-0 sudo[154978]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:43 compute-0 gracious_liskov[154810]: {
Dec 01 09:23:43 compute-0 gracious_liskov[154810]:     "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec 01 09:23:43 compute-0 gracious_liskov[154810]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:23:43 compute-0 gracious_liskov[154810]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 01 09:23:43 compute-0 gracious_liskov[154810]:         "osd_id": 0,
Dec 01 09:23:43 compute-0 gracious_liskov[154810]:         "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:23:43 compute-0 gracious_liskov[154810]:         "type": "bluestore"
Dec 01 09:23:43 compute-0 gracious_liskov[154810]:     },
Dec 01 09:23:43 compute-0 gracious_liskov[154810]:     "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec 01 09:23:43 compute-0 gracious_liskov[154810]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:23:43 compute-0 gracious_liskov[154810]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 01 09:23:43 compute-0 gracious_liskov[154810]:         "osd_id": 1,
Dec 01 09:23:43 compute-0 gracious_liskov[154810]:         "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:23:43 compute-0 gracious_liskov[154810]:         "type": "bluestore"
Dec 01 09:23:43 compute-0 gracious_liskov[154810]:     },
Dec 01 09:23:43 compute-0 gracious_liskov[154810]:     "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec 01 09:23:43 compute-0 gracious_liskov[154810]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:23:43 compute-0 gracious_liskov[154810]:         "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec 01 09:23:43 compute-0 gracious_liskov[154810]:         "osd_id": 2,
Dec 01 09:23:43 compute-0 gracious_liskov[154810]:         "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:23:43 compute-0 gracious_liskov[154810]:         "type": "bluestore"
Dec 01 09:23:43 compute-0 gracious_liskov[154810]:     }
Dec 01 09:23:43 compute-0 gracious_liskov[154810]: }
Dec 01 09:23:43 compute-0 systemd[1]: libpod-887826374d74feaa8a839731f4aefe83ee03e9d69550fa1a1c3f4dde835bbaa5.scope: Deactivated successfully.
Dec 01 09:23:43 compute-0 systemd[1]: libpod-887826374d74feaa8a839731f4aefe83ee03e9d69550fa1a1c3f4dde835bbaa5.scope: Consumed 1.034s CPU time.
Dec 01 09:23:43 compute-0 podman[154721]: 2025-12-01 09:23:43.258853705 +0000 UTC m=+1.364179337 container died 887826374d74feaa8a839731f4aefe83ee03e9d69550fa1a1c3f4dde835bbaa5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_liskov, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.schema-version=1.0, ceph=True)
Dec 01 09:23:43 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v353: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:43 compute-0 systemd[1]: var-lib-containers-storage-overlay-e272aaada3acd3a7ee19a0c40cdf63e9b3d03732d7313aeb9d26a4a90ac7a0ce-merged.mount: Deactivated successfully.
Dec 01 09:23:43 compute-0 podman[154721]: 2025-12-01 09:23:43.316041821 +0000 UTC m=+1.421367423 container remove 887826374d74feaa8a839731f4aefe83ee03e9d69550fa1a1c3f4dde835bbaa5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_liskov, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:23:43 compute-0 systemd[1]: libpod-conmon-887826374d74feaa8a839731f4aefe83ee03e9d69550fa1a1c3f4dde835bbaa5.scope: Deactivated successfully.
Dec 01 09:23:43 compute-0 sudo[154487]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:43 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:23:43 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:23:43 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:23:43 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:23:43 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 0c76680d-f4e5-4ca4-8d79-eafe7aee3f0a does not exist
Dec 01 09:23:43 compute-0 sudo[155135]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:23:43 compute-0 sudo[155135]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:23:43 compute-0 sudo[155135]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:43 compute-0 sudo[155172]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:23:43 compute-0 sudo[155172]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:23:43 compute-0 sudo[155172]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:43 compute-0 sudo[155235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndxgodmbchrzimxxutahwbnfcvjfrodr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581023.2345161-228-26270533193663/AnsiballZ_stat.py'
Dec 01 09:23:43 compute-0 sudo[155235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:43 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:23:43 compute-0 python3.9[155237]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:23:43 compute-0 sudo[155235]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:43 compute-0 sudo[155313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plxmtwhkwpznhvrlqumebjrvziwdredd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581023.2345161-228-26270533193663/AnsiballZ_file.py'
Dec 01 09:23:43 compute-0 sudo[155313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:44 compute-0 python3.9[155315]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:23:44 compute-0 sudo[155313]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:44 compute-0 ceph-mon[75031]: pgmap v353: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:44 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:23:44 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:23:44 compute-0 sudo[155465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzyygbcyfykajjzvrekllozhwfplvilo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581024.328952-228-39582284860780/AnsiballZ_stat.py'
Dec 01 09:23:44 compute-0 sudo[155465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:44 compute-0 python3.9[155467]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:23:44 compute-0 sudo[155465]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:45 compute-0 sudo[155543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwoebcygfbpwwzrpqmfbbgdbopaateei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581024.328952-228-39582284860780/AnsiballZ_file.py'
Dec 01 09:23:45 compute-0 sudo[155543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:45 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v354: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:45 compute-0 python3.9[155545]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:23:45 compute-0 sudo[155543]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:45 compute-0 sudo[155695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbfrfwyumtflhnmmtilgshlhdazlvmld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581025.4818451-251-161353928501735/AnsiballZ_file.py'
Dec 01 09:23:45 compute-0 sudo[155695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:45 compute-0 python3.9[155697]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:23:46 compute-0 sudo[155695]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:46 compute-0 sudo[155847]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qixzucglvishqrkpsqtwpnbhnerqdthu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581026.1866417-259-264967810303806/AnsiballZ_stat.py'
Dec 01 09:23:46 compute-0 sudo[155847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:46 compute-0 ceph-mon[75031]: pgmap v354: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:46 compute-0 python3.9[155849]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:23:46 compute-0 sudo[155847]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:46 compute-0 sudo[155925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytpyudcpxgdivbrrqwieltevuqdzedmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581026.1866417-259-264967810303806/AnsiballZ_file.py'
Dec 01 09:23:46 compute-0 sudo[155925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:47 compute-0 python3.9[155927]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:23:47 compute-0 sudo[155925]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:47 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v355: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:47 compute-0 sudo[156077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxyyxwdnmgyuchnjisxzzcyvxmmamjhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581027.4921374-271-17265756849361/AnsiballZ_stat.py'
Dec 01 09:23:47 compute-0 sudo[156077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:47 compute-0 python3.9[156079]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:23:48 compute-0 sudo[156077]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:48 compute-0 sudo[156155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwyxmequwwscsgoasroqirrbaxirvujw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581027.4921374-271-17265756849361/AnsiballZ_file.py'
Dec 01 09:23:48 compute-0 sudo[156155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:48 compute-0 python3.9[156157]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:23:48 compute-0 sudo[156155]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:48 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:23:48 compute-0 ceph-mon[75031]: pgmap v355: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:48 compute-0 sudo[156307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehpqwsphbrfqhjdqjhapfpelujpjxexn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581028.6631427-283-203890677300153/AnsiballZ_systemd.py'
Dec 01 09:23:48 compute-0 sudo[156307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:49 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v356: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:49 compute-0 python3.9[156309]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:23:49 compute-0 systemd[1]: Reloading.
Dec 01 09:23:49 compute-0 systemd-rc-local-generator[156333]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:23:49 compute-0 systemd-sysv-generator[156337]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:23:49 compute-0 sudo[156307]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:50 compute-0 sudo[156496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drvganjgjlknyvnnrfhzohqmoxezibvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581029.8051744-291-229091211912453/AnsiballZ_stat.py'
Dec 01 09:23:50 compute-0 sudo[156496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:50 compute-0 python3.9[156498]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:23:50 compute-0 sudo[156496]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:50 compute-0 sudo[156574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pamelplscmxhwvwbqbjhyncxzlkewovw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581029.8051744-291-229091211912453/AnsiballZ_file.py'
Dec 01 09:23:50 compute-0 sudo[156574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:50 compute-0 python3.9[156576]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:23:50 compute-0 sudo[156574]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:51 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v357: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:51 compute-0 sudo[156726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpfpzgczxogtouvgkzkistlyxwoytgow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581030.965847-303-22162038648317/AnsiballZ_stat.py'
Dec 01 09:23:51 compute-0 sudo[156726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:51 compute-0 ceph-mon[75031]: pgmap v356: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:51 compute-0 python3.9[156728]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:23:51 compute-0 sudo[156726]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:51 compute-0 sudo[156804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guzwsbefekphsemmfuunvaizmujbgjci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581030.965847-303-22162038648317/AnsiballZ_file.py'
Dec 01 09:23:51 compute-0 sudo[156804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:51 compute-0 python3.9[156806]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:23:52 compute-0 sudo[156804]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:52 compute-0 ceph-mon[75031]: pgmap v357: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:52 compute-0 sudo[156956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcrzgyfcjqdawskmgwunlyozqpkgpwcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581032.1551957-315-181934714136278/AnsiballZ_systemd.py'
Dec 01 09:23:52 compute-0 sudo[156956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:52 compute-0 python3.9[156958]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:23:52 compute-0 systemd[1]: Reloading.
Dec 01 09:23:53 compute-0 systemd-rc-local-generator[156984]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:23:53 compute-0 systemd-sysv-generator[156987]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:23:53 compute-0 systemd[1]: Starting Create netns directory...
Dec 01 09:23:53 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 01 09:23:53 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 01 09:23:53 compute-0 systemd[1]: Finished Create netns directory.
Dec 01 09:23:53 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v358: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:53 compute-0 sudo[156956]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:53 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:23:53 compute-0 sudo[157149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqzsnkvyxnunylszdjdgyraepachojhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581033.562704-325-46241731493818/AnsiballZ_file.py'
Dec 01 09:23:53 compute-0 sudo[157149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:54 compute-0 python3.9[157151]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:23:54 compute-0 sudo[157149]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:54 compute-0 ceph-mon[75031]: pgmap v358: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:54 compute-0 sudo[157301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtjboclonmwwlgajchoozmwtyogzwcpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581034.3100777-333-154824419591383/AnsiballZ_stat.py'
Dec 01 09:23:54 compute-0 sudo[157301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:54 compute-0 python3.9[157303]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:23:54 compute-0 sudo[157301]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:55 compute-0 sudo[157424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjtftyfkknbzvczokpegzabswucbxwsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581034.3100777-333-154824419591383/AnsiballZ_copy.py'
Dec 01 09:23:55 compute-0 sudo[157424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:55 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v359: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:55 compute-0 python3.9[157426]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764581034.3100777-333-154824419591383/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:23:55 compute-0 sudo[157424]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:55 compute-0 sudo[157576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufdzvjnvvlalyfjocvlhiqnyxvrwmvik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581035.7036705-350-209195543647262/AnsiballZ_file.py'
Dec 01 09:23:55 compute-0 sudo[157576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:56 compute-0 python3.9[157578]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:23:56 compute-0 sudo[157576]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:56 compute-0 ceph-mon[75031]: pgmap v359: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:56 compute-0 sudo[157728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znuypkghysxukkcpxcybklekzljufzia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581036.627149-358-267172713260222/AnsiballZ_stat.py'
Dec 01 09:23:56 compute-0 sudo[157728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:57 compute-0 python3.9[157730]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:23:57 compute-0 sudo[157728]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:57 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v360: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:57 compute-0 sudo[157851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhtazidogogaobjaisqbhkdaoqmgluaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581036.627149-358-267172713260222/AnsiballZ_copy.py'
Dec 01 09:23:57 compute-0 sudo[157851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:57 compute-0 python3.9[157853]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764581036.627149-358-267172713260222/.source.json _original_basename=._8ma83r7 follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:23:57 compute-0 sudo[157851]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:58 compute-0 sudo[158003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhntnjgatecasasgyxtsiojddfmflwdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581037.9983027-373-172936450973834/AnsiballZ_file.py'
Dec 01 09:23:58 compute-0 sudo[158003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:58 compute-0 ceph-mon[75031]: pgmap v360: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:58 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:23:58 compute-0 python3.9[158005]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:23:58 compute-0 sudo[158003]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:59 compute-0 sudo[158155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdwurpgueczpeplwdnepedhawciudqzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581038.826192-381-113034806303051/AnsiballZ_stat.py'
Dec 01 09:23:59 compute-0 sudo[158155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:59 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v361: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:23:59 compute-0 sudo[158155]: pam_unix(sudo:session): session closed for user root
Dec 01 09:23:59 compute-0 sudo[158278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdzdvullgfsujhhwjqzpnydybhihwtxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581038.826192-381-113034806303051/AnsiballZ_copy.py'
Dec 01 09:23:59 compute-0 sudo[158278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:23:59 compute-0 sudo[158278]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:00 compute-0 ceph-mon[75031]: pgmap v361: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:00 compute-0 sudo[158430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsfedbfarikkogsxalnjywnvtjhnamvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581040.2091231-398-263264519013179/AnsiballZ_container_config_data.py'
Dec 01 09:24:00 compute-0 sudo[158430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:00 compute-0 python3.9[158432]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Dec 01 09:24:00 compute-0 sudo[158430]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:01 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v362: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:01 compute-0 sudo[158582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovfkrnanauleehjmbqvsejznrnywllew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581041.071098-407-191554342191709/AnsiballZ_container_config_hash.py'
Dec 01 09:24:01 compute-0 sudo[158582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:01 compute-0 python3.9[158584]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 01 09:24:01 compute-0 sudo[158582]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:02 compute-0 sudo[158734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkxyjeqqcgewblzzbwczztsltaqsxdyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581042.0831277-416-183764334170437/AnsiballZ_podman_container_info.py'
Dec 01 09:24:02 compute-0 sudo[158734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:02 compute-0 ceph-mon[75031]: pgmap v362: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:02 compute-0 python3.9[158736]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 01 09:24:03 compute-0 sudo[158734]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:03 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v363: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:03 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:24:04 compute-0 sudo[158912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxznjlgadauhlbidakixezqhxvyazdjc ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764581043.6441617-429-20868200179579/AnsiballZ_edpm_container_manage.py'
Dec 01 09:24:04 compute-0 sudo[158912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:04 compute-0 python3[158914]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 01 09:24:04 compute-0 ceph-mon[75031]: pgmap v363: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:05 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v364: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:06 compute-0 ceph-mon[75031]: pgmap v364: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:07 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v365: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:08 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:24:08 compute-0 ceph-mon[75031]: pgmap v365: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:09 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v366: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:10 compute-0 ceph-mon[75031]: pgmap v366: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:10 compute-0 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:24:10 compute-0 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Cumulative writes: 4208 writes, 19K keys, 4208 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
                                           Cumulative WAL: 4208 writes, 369 syncs, 11.40 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 4208 writes, 19K keys, 4208 commit groups, 1.0 writes per commit group, ingest: 15.93 MB, 0.03 MB/s
                                           Interval WAL: 4208 writes, 369 syncs, 11.40 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b3090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b3090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b3090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 01 09:24:11 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v367: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:12 compute-0 ceph-mon[75031]: pgmap v367: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:12 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:24:12
Dec 01 09:24:12 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:24:12 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:24:12 compute-0 ceph-mgr[75324]: [balancer INFO root] pools ['backups', 'cephfs.cephfs.meta', 'cephfs.cephfs.data', 'vms', '.mgr', 'images', 'volumes']
Dec 01 09:24:12 compute-0 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec 01 09:24:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:24:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:24:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:24:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:24:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:24:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:24:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:24:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:24:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:24:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:24:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:24:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:24:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:24:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:24:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:24:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:24:13 compute-0 podman[158929]: 2025-12-01 09:24:13.181098474 +0000 UTC m=+8.647860806 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 01 09:24:13 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v368: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:13 compute-0 podman[159055]: 2025-12-01 09:24:13.360392029 +0000 UTC m=+0.055227171 container create 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Dec 01 09:24:13 compute-0 podman[159055]: 2025-12-01 09:24:13.331017869 +0000 UTC m=+0.025853021 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 01 09:24:13 compute-0 python3[158914]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 01 09:24:13 compute-0 sudo[158912]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:24:13 compute-0 sudo[159259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kovxdkotxwpzvgrkvzlapsunjlpifnyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581053.6803956-437-93744392750846/AnsiballZ_stat.py'
Dec 01 09:24:13 compute-0 sudo[159259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:13 compute-0 podman[159212]: 2025-12-01 09:24:13.996207401 +0000 UTC m=+0.094624204 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 01 09:24:14 compute-0 python3.9[159264]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:24:14 compute-0 sudo[159259]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:14 compute-0 ceph-mon[75031]: pgmap v368: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:14 compute-0 sudo[159422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdpiisshfqgllwmtrsonnyytchijunjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581054.4255009-446-35225030461490/AnsiballZ_file.py'
Dec 01 09:24:14 compute-0 sudo[159422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:14 compute-0 python3.9[159424]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:24:14 compute-0 sudo[159422]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:15 compute-0 sudo[159498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecgbfvcddwemddyxpidnvacgbcquiygc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581054.4255009-446-35225030461490/AnsiballZ_stat.py'
Dec 01 09:24:15 compute-0 sudo[159498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:15 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v369: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:15 compute-0 python3.9[159500]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:24:15 compute-0 sudo[159498]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:15 compute-0 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:24:15 compute-0 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Cumulative writes: 4343 writes, 19K keys, 4343 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
                                           Cumulative WAL: 4343 writes, 398 syncs, 10.91 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 4343 writes, 19K keys, 4343 commit groups, 1.0 writes per commit group, ingest: 16.04 MB, 0.03 MB/s
                                           Interval WAL: 4343 writes, 398 syncs, 10.91 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f19149090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f19149090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f19149090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 01 09:24:16 compute-0 sudo[159649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytgloagdozicmycouxtdudleymybletl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581055.5203018-446-273642022239231/AnsiballZ_copy.py'
Dec 01 09:24:16 compute-0 sudo[159649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:16 compute-0 python3.9[159651]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764581055.5203018-446-273642022239231/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:24:16 compute-0 sudo[159649]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:16 compute-0 sudo[159725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mprkvtarknyazbailmddslsfnoxzvner ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581055.5203018-446-273642022239231/AnsiballZ_systemd.py'
Dec 01 09:24:16 compute-0 sudo[159725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:16 compute-0 python3.9[159727]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 01 09:24:16 compute-0 ceph-mon[75031]: pgmap v369: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:16 compute-0 systemd[1]: Reloading.
Dec 01 09:24:16 compute-0 systemd-rc-local-generator[159749]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:24:16 compute-0 systemd-sysv-generator[159755]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:24:17 compute-0 sudo[159725]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:17 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v370: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:17 compute-0 sudo[159835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gygifklyahpejlmrhsrbktmdgnzvqoei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581055.5203018-446-273642022239231/AnsiballZ_systemd.py'
Dec 01 09:24:17 compute-0 sudo[159835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:17 compute-0 python3.9[159837]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:24:17 compute-0 systemd[1]: Reloading.
Dec 01 09:24:17 compute-0 systemd-rc-local-generator[159867]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:24:17 compute-0 systemd-sysv-generator[159871]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:24:18 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Dec 01 09:24:18 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:24:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c2505faea1fa7dfc6f89e3da507131bc1f3625ef975dfc2e3193dde241ea379/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec 01 09:24:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c2505faea1fa7dfc6f89e3da507131bc1f3625ef975dfc2e3193dde241ea379/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 01 09:24:18 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba.
Dec 01 09:24:18 compute-0 podman[159878]: 2025-12-01 09:24:18.43488009 +0000 UTC m=+0.377682081 container init 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 01 09:24:18 compute-0 ovn_metadata_agent[159893]: + sudo -E kolla_set_configs
Dec 01 09:24:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:24:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:24:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:24:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:24:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:24:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:24:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:24:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:24:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:24:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:24:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:24:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:24:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec 01 09:24:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:24:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:24:18 compute-0 podman[159878]: 2025-12-01 09:24:18.465750552 +0000 UTC m=+0.408552443 container start 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 01 09:24:18 compute-0 edpm-start-podman-container[159878]: ovn_metadata_agent
Dec 01 09:24:18 compute-0 edpm-start-podman-container[159877]: Creating additional drop-in dependency for "ovn_metadata_agent" (195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba)
Dec 01 09:24:18 compute-0 ovn_metadata_agent[159893]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 01 09:24:18 compute-0 ovn_metadata_agent[159893]: INFO:__main__:Validating config file
Dec 01 09:24:18 compute-0 ovn_metadata_agent[159893]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 01 09:24:18 compute-0 ovn_metadata_agent[159893]: INFO:__main__:Copying service configuration files
Dec 01 09:24:18 compute-0 ovn_metadata_agent[159893]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec 01 09:24:18 compute-0 ovn_metadata_agent[159893]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec 01 09:24:18 compute-0 ovn_metadata_agent[159893]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec 01 09:24:18 compute-0 ovn_metadata_agent[159893]: INFO:__main__:Writing out command to execute
Dec 01 09:24:18 compute-0 ovn_metadata_agent[159893]: INFO:__main__:Setting permission for /var/lib/neutron
Dec 01 09:24:18 compute-0 ovn_metadata_agent[159893]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec 01 09:24:18 compute-0 ovn_metadata_agent[159893]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec 01 09:24:18 compute-0 ovn_metadata_agent[159893]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec 01 09:24:18 compute-0 ovn_metadata_agent[159893]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec 01 09:24:18 compute-0 ovn_metadata_agent[159893]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec 01 09:24:18 compute-0 ovn_metadata_agent[159893]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec 01 09:24:18 compute-0 ovn_metadata_agent[159893]: ++ cat /run_command
Dec 01 09:24:18 compute-0 systemd[1]: Reloading.
Dec 01 09:24:18 compute-0 ovn_metadata_agent[159893]: + CMD=neutron-ovn-metadata-agent
Dec 01 09:24:18 compute-0 ovn_metadata_agent[159893]: + ARGS=
Dec 01 09:24:18 compute-0 ovn_metadata_agent[159893]: + sudo kolla_copy_cacerts
Dec 01 09:24:18 compute-0 podman[159900]: 2025-12-01 09:24:18.560309973 +0000 UTC m=+0.082101700 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent)
Dec 01 09:24:18 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:24:18 compute-0 ovn_metadata_agent[159893]: + [[ ! -n '' ]]
Dec 01 09:24:18 compute-0 ovn_metadata_agent[159893]: + . kolla_extend_start
Dec 01 09:24:18 compute-0 ovn_metadata_agent[159893]: Running command: 'neutron-ovn-metadata-agent'
Dec 01 09:24:18 compute-0 ovn_metadata_agent[159893]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Dec 01 09:24:18 compute-0 ovn_metadata_agent[159893]: + umask 0022
Dec 01 09:24:18 compute-0 ovn_metadata_agent[159893]: + exec neutron-ovn-metadata-agent
Dec 01 09:24:18 compute-0 systemd-sysv-generator[159974]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:24:18 compute-0 systemd-rc-local-generator[159970]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:24:18 compute-0 ceph-mon[75031]: pgmap v370: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:18 compute-0 systemd[1]: Started ovn_metadata_agent container.
Dec 01 09:24:18 compute-0 sudo[159835]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:19 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v371: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:19 compute-0 sshd-session[150520]: Connection closed by 192.168.122.30 port 38336
Dec 01 09:24:19 compute-0 sshd-session[150517]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:24:19 compute-0 systemd[1]: session-48.scope: Deactivated successfully.
Dec 01 09:24:19 compute-0 systemd[1]: session-48.scope: Consumed 57.378s CPU time.
Dec 01 09:24:19 compute-0 systemd-logind[788]: Session 48 logged out. Waiting for processes to exit.
Dec 01 09:24:19 compute-0 systemd-logind[788]: Removed session 48.
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.418 159899 INFO neutron.common.config [-] Logging enabled!
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.418 159899 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.418 159899 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.419 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.419 159899 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.419 159899 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.419 159899 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.419 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.419 159899 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.419 159899 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.419 159899 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.420 159899 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.420 159899 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.420 159899 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.420 159899 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.420 159899 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.420 159899 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.420 159899 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.420 159899 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.421 159899 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.421 159899 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.421 159899 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.421 159899 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.421 159899 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.421 159899 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.421 159899 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.421 159899 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.421 159899 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.421 159899 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.422 159899 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.422 159899 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.422 159899 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.422 159899 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.422 159899 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.422 159899 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.422 159899 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.422 159899 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.422 159899 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.423 159899 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.423 159899 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.423 159899 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.423 159899 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.423 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.423 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.423 159899 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.423 159899 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.423 159899 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.424 159899 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.424 159899 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.424 159899 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.424 159899 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.424 159899 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.424 159899 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.424 159899 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.424 159899 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.424 159899 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.425 159899 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.425 159899 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.425 159899 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.425 159899 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.425 159899 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.425 159899 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.425 159899 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.425 159899 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.425 159899 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.426 159899 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.426 159899 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.426 159899 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.426 159899 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.426 159899 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.426 159899 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.426 159899 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.426 159899 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.427 159899 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.427 159899 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.427 159899 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.427 159899 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.427 159899 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.427 159899 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.427 159899 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.427 159899 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.427 159899 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.428 159899 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.428 159899 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.428 159899 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.428 159899 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.428 159899 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.428 159899 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.428 159899 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.428 159899 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.428 159899 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.429 159899 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.429 159899 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.429 159899 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.429 159899 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.429 159899 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.429 159899 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.429 159899 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.429 159899 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.429 159899 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.429 159899 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.430 159899 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.430 159899 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.430 159899 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.430 159899 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.430 159899 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.430 159899 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.430 159899 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.430 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.430 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.431 159899 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.431 159899 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.431 159899 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.431 159899 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.431 159899 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.431 159899 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.431 159899 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.431 159899 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.432 159899 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.432 159899 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.432 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.432 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.432 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.432 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.432 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.432 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.433 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.433 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.433 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.433 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.433 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.433 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.433 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.433 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.433 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.434 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.434 159899 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.434 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.434 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.434 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.434 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.434 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.434 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.435 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.435 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.435 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.435 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.435 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.435 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.435 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.435 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.435 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.436 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.436 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.436 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.436 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.436 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.436 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.436 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.436 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.436 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.437 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.437 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.437 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.437 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.437 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.437 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.437 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.437 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.437 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.438 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.438 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.438 159899 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.438 159899 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.438 159899 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.438 159899 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.438 159899 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.438 159899 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.439 159899 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.439 159899 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.439 159899 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.439 159899 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.439 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.439 159899 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.439 159899 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.439 159899 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.439 159899 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.440 159899 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.440 159899 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.440 159899 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.440 159899 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.440 159899 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.440 159899 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.440 159899 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.440 159899 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.440 159899 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.441 159899 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.441 159899 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.441 159899 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.441 159899 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.441 159899 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.441 159899 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.441 159899 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.441 159899 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.442 159899 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.442 159899 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.442 159899 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.442 159899 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.442 159899 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.442 159899 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.442 159899 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.442 159899 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.442 159899 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.443 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.443 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.443 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.443 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.443 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.443 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.443 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.443 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.443 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.444 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.444 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.444 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.444 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.444 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.444 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.444 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.444 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.444 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.445 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.445 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.445 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.445 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.445 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.445 159899 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.445 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.445 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.446 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.446 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.446 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.446 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.446 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.446 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.446 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.446 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.446 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.447 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.447 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.447 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.447 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.447 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.447 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.447 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.447 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.448 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.448 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.448 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.448 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.448 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.448 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.448 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.448 159899 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.449 159899 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.449 159899 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.449 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.449 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.449 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.449 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.449 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.449 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.449 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.450 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.450 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.450 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.450 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.450 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.450 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.450 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.450 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.450 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.451 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.451 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.451 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.451 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.451 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.451 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.451 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.451 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.452 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.452 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.452 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.452 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.452 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.452 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.452 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.452 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.452 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.453 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.453 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.453 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.453 159899 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.453 159899 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.462 159899 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.462 159899 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.462 159899 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.462 159899 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.462 159899 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.475 159899 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name a8013a17-6378-4c2f-a5de-9d3b29c7a42e (UUID: a8013a17-6378-4c2f-a5de-9d3b29c7a42e) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.505 159899 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.505 159899 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.505 159899 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.505 159899 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.510 159899 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.516 159899 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.523 159899 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'a8013a17-6378-4c2f-a5de-9d3b29c7a42e'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7ff6ad027af0>], external_ids={}, name=a8013a17-6378-4c2f-a5de-9d3b29c7a42e, nb_cfg_timestamp=1764581000829, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.524 159899 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7ff6ad02ab20>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.525 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.525 159899 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.526 159899 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.526 159899 INFO oslo_service.service [-] Starting 1 workers
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.533 159899 DEBUG oslo_service.service [-] Started child 160008 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.537 159899 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpdz57tbq9/privsep.sock']
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.537 160008 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-166554'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.562 160008 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.562 160008 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.562 160008 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.569 160008 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.576 160008 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec 01 09:24:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:20.582 160008 INFO eventlet.wsgi.server [-] (160008) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Dec 01 09:24:20 compute-0 ceph-mon[75031]: pgmap v371: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:21 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Dec 01 09:24:21 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:21.259 159899 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 01 09:24:21 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:21.260 159899 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpdz57tbq9/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Dec 01 09:24:21 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:21.144 160013 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 01 09:24:21 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:21.149 160013 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 01 09:24:21 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:21.152 160013 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Dec 01 09:24:21 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:21.152 160013 INFO oslo.privsep.daemon [-] privsep daemon running as pid 160013
Dec 01 09:24:21 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:21.263 160013 DEBUG oslo.privsep.daemon [-] privsep: reply[19cd340e-88d2-4d9a-886a-294f0436eae8]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 01 09:24:21 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v372: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:21 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:21.792 160013 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:24:21 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:21.792 160013 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:24:21 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:21.793 160013 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.410 160013 DEBUG oslo.privsep.daemon [-] privsep: reply[f673a4e7-3009-4b27-8769-e67efc72cbca]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.413 159899 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=a8013a17-6378-4c2f-a5de-9d3b29c7a42e, column=external_ids, values=({'neutron:ovn-metadata-id': '7ca6876a-62db-5b7a-a446-404679c57fc8'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.428 159899 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a8013a17-6378-4c2f-a5de-9d3b29c7a42e, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.435 159899 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.436 159899 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.436 159899 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.436 159899 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.436 159899 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.437 159899 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.437 159899 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.437 159899 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.438 159899 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.438 159899 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.438 159899 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.439 159899 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.439 159899 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.439 159899 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.440 159899 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.440 159899 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.440 159899 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.441 159899 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.441 159899 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.441 159899 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.442 159899 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.442 159899 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.442 159899 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.443 159899 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.443 159899 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.443 159899 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.444 159899 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.444 159899 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.444 159899 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.445 159899 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.445 159899 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.445 159899 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.446 159899 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.446 159899 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.446 159899 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.447 159899 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.447 159899 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.447 159899 DEBUG oslo_service.service [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.448 159899 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.448 159899 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.448 159899 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.449 159899 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.449 159899 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.449 159899 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.450 159899 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.450 159899 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.450 159899 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.450 159899 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.450 159899 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.450 159899 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.451 159899 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.451 159899 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.451 159899 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.451 159899 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.451 159899 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.452 159899 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.452 159899 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.452 159899 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.452 159899 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.452 159899 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.453 159899 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.453 159899 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.453 159899 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.453 159899 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.453 159899 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.454 159899 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.454 159899 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.454 159899 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.454 159899 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.455 159899 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.455 159899 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.455 159899 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.455 159899 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.455 159899 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.456 159899 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.456 159899 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.456 159899 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.456 159899 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.456 159899 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.457 159899 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.457 159899 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.457 159899 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.457 159899 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.457 159899 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.458 159899 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.458 159899 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.458 159899 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.458 159899 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.458 159899 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.458 159899 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.459 159899 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.459 159899 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.459 159899 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.459 159899 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.459 159899 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.459 159899 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.460 159899 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.460 159899 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.460 159899 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.460 159899 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.460 159899 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.460 159899 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.460 159899 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.461 159899 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.461 159899 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.461 159899 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.461 159899 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.461 159899 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.462 159899 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.462 159899 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.462 159899 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.462 159899 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.462 159899 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.463 159899 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.463 159899 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.463 159899 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.463 159899 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.464 159899 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.464 159899 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.464 159899 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.464 159899 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.464 159899 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.464 159899 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.465 159899 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.465 159899 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.465 159899 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.465 159899 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.465 159899 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.465 159899 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.466 159899 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.466 159899 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.466 159899 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.466 159899 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.466 159899 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.467 159899 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.467 159899 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.467 159899 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.467 159899 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.467 159899 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.467 159899 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.468 159899 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.468 159899 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.468 159899 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.468 159899 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.468 159899 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.468 159899 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.468 159899 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.469 159899 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.469 159899 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.469 159899 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.469 159899 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.469 159899 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.469 159899 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.470 159899 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.470 159899 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.470 159899 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.470 159899 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.470 159899 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.470 159899 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.470 159899 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.471 159899 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.471 159899 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.471 159899 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.471 159899 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.471 159899 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.471 159899 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.472 159899 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.472 159899 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.472 159899 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.472 159899 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.472 159899 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.472 159899 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.473 159899 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.473 159899 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.473 159899 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.473 159899 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.473 159899 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.473 159899 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.473 159899 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.474 159899 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.474 159899 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.474 159899 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.474 159899 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.474 159899 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.475 159899 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.475 159899 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.475 159899 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.475 159899 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.475 159899 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.476 159899 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.476 159899 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.476 159899 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.476 159899 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.476 159899 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.476 159899 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.477 159899 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.477 159899 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.477 159899 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.477 159899 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.477 159899 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.477 159899 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.478 159899 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.478 159899 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.478 159899 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.478 159899 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.478 159899 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.478 159899 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.478 159899 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.479 159899 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.479 159899 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.479 159899 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.479 159899 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.479 159899 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.479 159899 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.479 159899 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.480 159899 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.480 159899 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.480 159899 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.480 159899 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.480 159899 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.480 159899 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.480 159899 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.481 159899 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.481 159899 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.481 159899 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.481 159899 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.481 159899 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.481 159899 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.481 159899 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.482 159899 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.482 159899 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.482 159899 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.482 159899 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.482 159899 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.483 159899 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.483 159899 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.483 159899 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.483 159899 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.483 159899 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.484 159899 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.484 159899 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.484 159899 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.484 159899 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.484 159899 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.485 159899 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.485 159899 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.485 159899 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.485 159899 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.485 159899 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.485 159899 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.485 159899 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.485 159899 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.486 159899 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.486 159899 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.486 159899 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.486 159899 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.486 159899 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.486 159899 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.486 159899 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.487 159899 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.487 159899 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.487 159899 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.487 159899 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.487 159899 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.487 159899 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.487 159899 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.487 159899 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.488 159899 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.488 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.488 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.488 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.488 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.488 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.488 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.489 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.489 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.489 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.489 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.489 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.489 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.489 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.489 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.490 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.490 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.490 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.490 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.490 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.490 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.490 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.491 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.491 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.491 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.491 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.491 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.491 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.491 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.491 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.492 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.492 159899 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.492 159899 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.492 159899 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.492 159899 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.492 159899 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:24:22 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:24:22.492 159899 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 01 09:24:22 compute-0 ceph-mon[75031]: pgmap v372: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:23 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v373: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:23 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:24:24 compute-0 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:24:24 compute-0 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Cumulative writes: 4162 writes, 19K keys, 4162 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
                                           Cumulative WAL: 4162 writes, 352 syncs, 11.82 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 4162 writes, 19K keys, 4162 commit groups, 1.0 writes per commit group, ingest: 15.87 MB, 0.03 MB/s
                                           Interval WAL: 4162 writes, 352 syncs, 11.82 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.055       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.055       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.05              0.00         1    0.055       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 01 09:24:24 compute-0 ceph-mon[75031]: pgmap v373: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:25 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v374: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:25 compute-0 sshd-session[160018]: Accepted publickey for zuul from 192.168.122.30 port 41430 ssh2: ECDSA SHA256:qeYLzcMt7u+aIXWvxTim9ZQrhR80x0VMZgLc05vjQmw
Dec 01 09:24:25 compute-0 systemd-logind[788]: New session 49 of user zuul.
Dec 01 09:24:25 compute-0 systemd[1]: Started Session 49 of User zuul.
Dec 01 09:24:25 compute-0 sshd-session[160018]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:24:26 compute-0 python3.9[160172]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:24:26 compute-0 ceph-mon[75031]: pgmap v374: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:27 compute-0 sshd-session[160074]: Connection closed by 106.75.87.177 port 58722
Dec 01 09:24:27 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v375: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:27 compute-0 sudo[160328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aguyfgqrusyawaqcsuqfvbduiznuywbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581067.0531836-34-191473624537046/AnsiballZ_command.py'
Dec 01 09:24:27 compute-0 sudo[160328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:27 compute-0 python3.9[160330]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:24:27 compute-0 sudo[160328]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:28 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:24:28 compute-0 ceph-mgr[75324]: [devicehealth INFO root] Check health
Dec 01 09:24:29 compute-0 ceph-mon[75031]: pgmap v375: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:29 compute-0 sudo[160491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrlrronhkayezexyavqaxskkxdlevhkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581068.2642176-45-42881966291734/AnsiballZ_systemd_service.py'
Dec 01 09:24:29 compute-0 sudo[160491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:29 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v376: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:29 compute-0 python3.9[160493]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 01 09:24:29 compute-0 systemd[1]: Reloading.
Dec 01 09:24:29 compute-0 systemd-rc-local-generator[160514]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:24:29 compute-0 systemd-sysv-generator[160521]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:24:29 compute-0 sudo[160491]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:30 compute-0 python3.9[160679]: ansible-ansible.builtin.service_facts Invoked
Dec 01 09:24:30 compute-0 network[160696]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 01 09:24:30 compute-0 network[160697]: 'network-scripts' will be removed from distribution in near future.
Dec 01 09:24:30 compute-0 network[160698]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 01 09:24:31 compute-0 ceph-mon[75031]: pgmap v376: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:31 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v377: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:33 compute-0 ceph-mon[75031]: pgmap v377: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:33 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v378: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:33 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:24:35 compute-0 ceph-mon[75031]: pgmap v378: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:35 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v379: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:35 compute-0 sudo[160958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjwpiwjtdpzngnsjpbrjorsarzatquup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581075.1044793-64-6549539734459/AnsiballZ_systemd_service.py'
Dec 01 09:24:35 compute-0 sudo[160958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:35 compute-0 python3.9[160960]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:24:35 compute-0 sudo[160958]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:36 compute-0 sudo[161111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szvvzcqhhkzpifeujhmqnepmxxvhnjzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581075.9618433-64-174643344807477/AnsiballZ_systemd_service.py'
Dec 01 09:24:36 compute-0 sudo[161111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:36 compute-0 python3.9[161113]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:24:36 compute-0 sudo[161111]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:37 compute-0 sudo[161264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xascysiwnpodakohougpqlfsvchgvopx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581076.7304761-64-238646505759532/AnsiballZ_systemd_service.py'
Dec 01 09:24:37 compute-0 sudo[161264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:37 compute-0 ceph-mon[75031]: pgmap v379: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:37 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v380: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:37 compute-0 python3.9[161266]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:24:37 compute-0 sudo[161264]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:37 compute-0 sudo[161417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lligtphajdafcjyxnppwwgsojzzelvlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581077.5294566-64-203645012161772/AnsiballZ_systemd_service.py'
Dec 01 09:24:37 compute-0 sudo[161417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:38 compute-0 python3.9[161419]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:24:38 compute-0 sudo[161417]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:38 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:24:38 compute-0 sudo[161570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhdnpojgecthcmatsdwqdjymhzegjkbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581078.3077002-64-198380889346914/AnsiballZ_systemd_service.py'
Dec 01 09:24:38 compute-0 sudo[161570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:38 compute-0 python3.9[161572]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:24:38 compute-0 sudo[161570]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:39 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v381: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:39 compute-0 ceph-mon[75031]: pgmap v380: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:39 compute-0 sudo[161723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzxxuxcwfmarxfgnbxirfkbveepeqinj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581079.1502817-64-14103456745349/AnsiballZ_systemd_service.py'
Dec 01 09:24:39 compute-0 sudo[161723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:39 compute-0 python3.9[161725]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:24:39 compute-0 sudo[161723]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:40 compute-0 sudo[161876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnzplduehizsgbzmmmwtkzjqapgpiwuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581079.9601223-64-109887256195063/AnsiballZ_systemd_service.py'
Dec 01 09:24:40 compute-0 sudo[161876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:40 compute-0 ceph-mon[75031]: pgmap v381: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:40 compute-0 python3.9[161878]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:24:40 compute-0 sudo[161876]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:41 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v382: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:41 compute-0 sudo[162029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlpfyjtshdbxislyypdqfwnvpyqkodnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581080.9656081-116-242691014960384/AnsiballZ_file.py'
Dec 01 09:24:41 compute-0 sudo[162029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:41 compute-0 sshd-session[160173]: Connection closed by 106.75.87.177 port 59194 [preauth]
Dec 01 09:24:41 compute-0 python3.9[162031]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:24:41 compute-0 sudo[162029]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:42 compute-0 sudo[162181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfdhkosyzlmlgdosrrwyjskjztbnknwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581081.815963-116-205689178247959/AnsiballZ_file.py'
Dec 01 09:24:42 compute-0 sudo[162181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:42 compute-0 python3.9[162183]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:24:42 compute-0 sudo[162181]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:42 compute-0 ceph-mon[75031]: pgmap v382: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:42 compute-0 sudo[162333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfskfejrwlodotqbxuqnrstdiccbxbsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581082.481762-116-7921384433128/AnsiballZ_file.py'
Dec 01 09:24:42 compute-0 sudo[162333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:42 compute-0 python3.9[162335]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:24:43 compute-0 sudo[162333]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:24:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:24:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:24:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:24:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:24:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:24:43 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v383: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:43 compute-0 sudo[162485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyodpkabcrzoeqxmmvshpnuqnopbthsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581083.1497645-116-3795503500184/AnsiballZ_file.py'
Dec 01 09:24:43 compute-0 sudo[162485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:43 compute-0 sudo[162488]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:24:43 compute-0 sudo[162488]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:24:43 compute-0 sudo[162488]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:43 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:24:43 compute-0 sudo[162513]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:24:43 compute-0 sudo[162513]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:24:43 compute-0 sudo[162513]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:43 compute-0 python3.9[162487]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:24:43 compute-0 sudo[162485]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:43 compute-0 sudo[162538]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:24:43 compute-0 sudo[162538]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:24:43 compute-0 sudo[162538]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:43 compute-0 sudo[162568]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Dec 01 09:24:43 compute-0 sudo[162568]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:24:44 compute-0 sudo[162754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhlcjrnkjelzjlykkzsltsolsfmlkujc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581083.783867-116-229897102062417/AnsiballZ_file.py'
Dec 01 09:24:44 compute-0 sudo[162754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:44 compute-0 sudo[162568]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"} v 0) v1
Dec 01 09:24:44 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec 01 09:24:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:24:44 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:24:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec 01 09:24:44 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:24:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec 01 09:24:44 compute-0 podman[162756]: 2025-12-01 09:24:44.253744257 +0000 UTC m=+0.162898560 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Dec 01 09:24:44 compute-0 python3.9[162759]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:24:44 compute-0 sudo[162754]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:44 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:24:44 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 9bf0fb69-ad03-41da-9bb2-a1dc5b800832 does not exist
Dec 01 09:24:44 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev a4817f29-25fe-4abc-8f2f-768563498f2e does not exist
Dec 01 09:24:44 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev c89a7ad3-c712-4738-832f-80d64b4a3f34 does not exist
Dec 01 09:24:44 compute-0 sudo[162952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nakkvzksdvxnogalrcxrcdszhbwgnsnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581084.4067175-116-199147834429506/AnsiballZ_file.py'
Dec 01 09:24:44 compute-0 sudo[162952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:45 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec 01 09:24:45 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:24:45 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec 01 09:24:45 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:24:45 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:24:45 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:24:45 compute-0 sudo[162955]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:24:45 compute-0 sudo[162955]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:24:45 compute-0 sudo[162955]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:45 compute-0 ceph-mon[75031]: pgmap v383: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:45 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec 01 09:24:45 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:24:45 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:24:45 compute-0 sudo[162980]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:24:45 compute-0 sudo[162980]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:24:45 compute-0 sudo[162980]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:45 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v384: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:45 compute-0 sudo[163005]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:24:45 compute-0 sudo[163005]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:24:45 compute-0 sudo[163005]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:45 compute-0 sudo[163030]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --yes --no-systemd
Dec 01 09:24:45 compute-0 sudo[163030]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:24:45 compute-0 podman[163096]: 2025-12-01 09:24:45.758050067 +0000 UTC m=+0.019870624 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:24:45 compute-0 podman[163096]: 2025-12-01 09:24:45.887868006 +0000 UTC m=+0.149688543 container create 033458fdb3514e4ba7f09815535568da0cd50c3fc0e1d5c87bcbe52b466e32f4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_williams, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:24:45 compute-0 systemd[1]: Started libpod-conmon-033458fdb3514e4ba7f09815535568da0cd50c3fc0e1d5c87bcbe52b466e32f4.scope.
Dec 01 09:24:45 compute-0 python3.9[162954]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:24:45 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:24:45 compute-0 sudo[162952]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:45 compute-0 podman[163096]: 2025-12-01 09:24:45.970325701 +0000 UTC m=+0.232146238 container init 033458fdb3514e4ba7f09815535568da0cd50c3fc0e1d5c87bcbe52b466e32f4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_williams, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef)
Dec 01 09:24:45 compute-0 podman[163096]: 2025-12-01 09:24:45.97871852 +0000 UTC m=+0.240539077 container start 033458fdb3514e4ba7f09815535568da0cd50c3fc0e1d5c87bcbe52b466e32f4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_williams, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec 01 09:24:45 compute-0 podman[163096]: 2025-12-01 09:24:45.982588376 +0000 UTC m=+0.244408943 container attach 033458fdb3514e4ba7f09815535568da0cd50c3fc0e1d5c87bcbe52b466e32f4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_williams, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507)
Dec 01 09:24:45 compute-0 busy_williams[163112]: 167 167
Dec 01 09:24:45 compute-0 systemd[1]: libpod-033458fdb3514e4ba7f09815535568da0cd50c3fc0e1d5c87bcbe52b466e32f4.scope: Deactivated successfully.
Dec 01 09:24:45 compute-0 podman[163096]: 2025-12-01 09:24:45.98419299 +0000 UTC m=+0.246013527 container died 033458fdb3514e4ba7f09815535568da0cd50c3fc0e1d5c87bcbe52b466e32f4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_williams, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, io.buildah.version=1.39.3)
Dec 01 09:24:46 compute-0 systemd[1]: var-lib-containers-storage-overlay-3167aad26c8120ddb5130faefb048531f1d49e74dec54f29c524c62e11012c6c-merged.mount: Deactivated successfully.
Dec 01 09:24:46 compute-0 podman[163096]: 2025-12-01 09:24:46.022111157 +0000 UTC m=+0.283931694 container remove 033458fdb3514e4ba7f09815535568da0cd50c3fc0e1d5c87bcbe52b466e32f4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_williams, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef)
Dec 01 09:24:46 compute-0 systemd[1]: libpod-conmon-033458fdb3514e4ba7f09815535568da0cd50c3fc0e1d5c87bcbe52b466e32f4.scope: Deactivated successfully.
Dec 01 09:24:46 compute-0 podman[163195]: 2025-12-01 09:24:46.177490875 +0000 UTC m=+0.043602763 container create 28e32c9dc7d74ef1a6dcb24a78d515cd5a121a957a7451c4686793fe60fb9fbb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_heisenberg, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Dec 01 09:24:46 compute-0 systemd[1]: Started libpod-conmon-28e32c9dc7d74ef1a6dcb24a78d515cd5a121a957a7451c4686793fe60fb9fbb.scope.
Dec 01 09:24:46 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:24:46 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:24:46 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:24:46 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:24:46 compute-0 ceph-mon[75031]: pgmap v384: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:46 compute-0 podman[163195]: 2025-12-01 09:24:46.161721684 +0000 UTC m=+0.027833592 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:24:46 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:24:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/07f1e020e1a732e2e67bb828616447b0b43cc8581e448dae93d82fd867d7224a/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:24:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/07f1e020e1a732e2e67bb828616447b0b43cc8581e448dae93d82fd867d7224a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:24:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/07f1e020e1a732e2e67bb828616447b0b43cc8581e448dae93d82fd867d7224a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:24:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/07f1e020e1a732e2e67bb828616447b0b43cc8581e448dae93d82fd867d7224a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:24:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/07f1e020e1a732e2e67bb828616447b0b43cc8581e448dae93d82fd867d7224a/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:24:46 compute-0 podman[163195]: 2025-12-01 09:24:46.277703855 +0000 UTC m=+0.143815783 container init 28e32c9dc7d74ef1a6dcb24a78d515cd5a121a957a7451c4686793fe60fb9fbb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_heisenberg, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3)
Dec 01 09:24:46 compute-0 podman[163195]: 2025-12-01 09:24:46.287429811 +0000 UTC m=+0.153541709 container start 28e32c9dc7d74ef1a6dcb24a78d515cd5a121a957a7451c4686793fe60fb9fbb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_heisenberg, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_REF=reef, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Dec 01 09:24:46 compute-0 podman[163195]: 2025-12-01 09:24:46.291023799 +0000 UTC m=+0.157135707 container attach 28e32c9dc7d74ef1a6dcb24a78d515cd5a121a957a7451c4686793fe60fb9fbb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_heisenberg, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef)
Dec 01 09:24:46 compute-0 sudo[163305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrthofbasaulownswcdpusazdktsskgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581086.0917597-116-24406351557426/AnsiballZ_file.py'
Dec 01 09:24:46 compute-0 sudo[163305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:46 compute-0 python3.9[163307]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:24:46 compute-0 sudo[163305]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:47 compute-0 sudo[163465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-goqdheznfaytwxcugwbvpeqdbyidnkrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581086.7670963-166-189294265016360/AnsiballZ_file.py'
Dec 01 09:24:47 compute-0 sudo[163465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:47 compute-0 python3.9[163467]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:24:47 compute-0 sudo[163465]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:47 compute-0 boring_heisenberg[163250]: --> passed data devices: 0 physical, 3 LVM
Dec 01 09:24:47 compute-0 boring_heisenberg[163250]: --> relative data size: 1.0
Dec 01 09:24:47 compute-0 boring_heisenberg[163250]: --> All data devices are unavailable
Dec 01 09:24:47 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v385: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:47 compute-0 systemd[1]: libpod-28e32c9dc7d74ef1a6dcb24a78d515cd5a121a957a7451c4686793fe60fb9fbb.scope: Deactivated successfully.
Dec 01 09:24:47 compute-0 systemd[1]: libpod-28e32c9dc7d74ef1a6dcb24a78d515cd5a121a957a7451c4686793fe60fb9fbb.scope: Consumed 1.010s CPU time.
Dec 01 09:24:47 compute-0 podman[163195]: 2025-12-01 09:24:47.350832204 +0000 UTC m=+1.216944102 container died 28e32c9dc7d74ef1a6dcb24a78d515cd5a121a957a7451c4686793fe60fb9fbb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_heisenberg, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:24:47 compute-0 systemd[1]: var-lib-containers-storage-overlay-07f1e020e1a732e2e67bb828616447b0b43cc8581e448dae93d82fd867d7224a-merged.mount: Deactivated successfully.
Dec 01 09:24:47 compute-0 podman[163195]: 2025-12-01 09:24:47.430337128 +0000 UTC m=+1.296449046 container remove 28e32c9dc7d74ef1a6dcb24a78d515cd5a121a957a7451c4686793fe60fb9fbb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_heisenberg, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0)
Dec 01 09:24:47 compute-0 systemd[1]: libpod-conmon-28e32c9dc7d74ef1a6dcb24a78d515cd5a121a957a7451c4686793fe60fb9fbb.scope: Deactivated successfully.
Dec 01 09:24:47 compute-0 sudo[163030]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:47 compute-0 sudo[163560]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:24:47 compute-0 sudo[163560]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:24:47 compute-0 sudo[163560]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:47 compute-0 sudo[163604]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:24:47 compute-0 sudo[163604]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:24:47 compute-0 sudo[163604]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:47 compute-0 sudo[163647]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:24:47 compute-0 sudo[163647]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:24:47 compute-0 sudo[163647]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:47 compute-0 sudo[163744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbuhhpvkpopvvnyretyiolvihuymitso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581087.433786-166-94493171837451/AnsiballZ_file.py'
Dec 01 09:24:47 compute-0 sudo[163744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:47 compute-0 sudo[163699]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- lvm list --format json
Dec 01 09:24:47 compute-0 sudo[163699]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:24:47 compute-0 python3.9[163747]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:24:47 compute-0 sudo[163744]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:48 compute-0 podman[163812]: 2025-12-01 09:24:48.014925471 +0000 UTC m=+0.049192066 container create 3c4bf980ab491dc2ae2af6e1c25b6e2f8f1004c52ccfbcf377c5912a10e16fdf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_cartwright, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Dec 01 09:24:48 compute-0 systemd[1]: Started libpod-conmon-3c4bf980ab491dc2ae2af6e1c25b6e2f8f1004c52ccfbcf377c5912a10e16fdf.scope.
Dec 01 09:24:48 compute-0 podman[163812]: 2025-12-01 09:24:47.990948206 +0000 UTC m=+0.025214831 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:24:48 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:24:48 compute-0 podman[163812]: 2025-12-01 09:24:48.103258936 +0000 UTC m=+0.137525531 container init 3c4bf980ab491dc2ae2af6e1c25b6e2f8f1004c52ccfbcf377c5912a10e16fdf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_cartwright, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:24:48 compute-0 podman[163812]: 2025-12-01 09:24:48.109697552 +0000 UTC m=+0.143964147 container start 3c4bf980ab491dc2ae2af6e1c25b6e2f8f1004c52ccfbcf377c5912a10e16fdf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_cartwright, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507)
Dec 01 09:24:48 compute-0 podman[163812]: 2025-12-01 09:24:48.11288821 +0000 UTC m=+0.147154805 container attach 3c4bf980ab491dc2ae2af6e1c25b6e2f8f1004c52ccfbcf377c5912a10e16fdf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_cartwright, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:24:48 compute-0 hungry_cartwright[163851]: 167 167
Dec 01 09:24:48 compute-0 systemd[1]: libpod-3c4bf980ab491dc2ae2af6e1c25b6e2f8f1004c52ccfbcf377c5912a10e16fdf.scope: Deactivated successfully.
Dec 01 09:24:48 compute-0 podman[163812]: 2025-12-01 09:24:48.11362428 +0000 UTC m=+0.147890875 container died 3c4bf980ab491dc2ae2af6e1c25b6e2f8f1004c52ccfbcf377c5912a10e16fdf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_cartwright, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec 01 09:24:48 compute-0 systemd[1]: var-lib-containers-storage-overlay-698419201c78f695b168d0b0f88f7fbb6acfe1260a328e40f46878484a265c4c-merged.mount: Deactivated successfully.
Dec 01 09:24:48 compute-0 podman[163812]: 2025-12-01 09:24:48.149135321 +0000 UTC m=+0.183401916 container remove 3c4bf980ab491dc2ae2af6e1c25b6e2f8f1004c52ccfbcf377c5912a10e16fdf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_cartwright, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:24:48 compute-0 systemd[1]: libpod-conmon-3c4bf980ab491dc2ae2af6e1c25b6e2f8f1004c52ccfbcf377c5912a10e16fdf.scope: Deactivated successfully.
Dec 01 09:24:48 compute-0 sudo[163988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dopewxmwolbstpoceoxqotesvmvgrihp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581088.0311086-166-118417301287342/AnsiballZ_file.py'
Dec 01 09:24:48 compute-0 sudo[163988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:48 compute-0 podman[163957]: 2025-12-01 09:24:48.298954897 +0000 UTC m=+0.039154472 container create 796ae269f72fee80037eb2ceaf5269c6cb6cc641f8ffccc11c84f097a087418d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_taussig, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS)
Dec 01 09:24:48 compute-0 systemd[1]: Started libpod-conmon-796ae269f72fee80037eb2ceaf5269c6cb6cc641f8ffccc11c84f097a087418d.scope.
Dec 01 09:24:48 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:24:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33abc402fbff47d6f65d10ddc4a5ea1eecbbb6fe8e9a4009a6a3e278e41c1946/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:24:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33abc402fbff47d6f65d10ddc4a5ea1eecbbb6fe8e9a4009a6a3e278e41c1946/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:24:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33abc402fbff47d6f65d10ddc4a5ea1eecbbb6fe8e9a4009a6a3e278e41c1946/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:24:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33abc402fbff47d6f65d10ddc4a5ea1eecbbb6fe8e9a4009a6a3e278e41c1946/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:24:48 compute-0 podman[163957]: 2025-12-01 09:24:48.376132487 +0000 UTC m=+0.116332142 container init 796ae269f72fee80037eb2ceaf5269c6cb6cc641f8ffccc11c84f097a087418d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_taussig, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3)
Dec 01 09:24:48 compute-0 podman[163957]: 2025-12-01 09:24:48.281829799 +0000 UTC m=+0.022029404 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:24:48 compute-0 podman[163957]: 2025-12-01 09:24:48.384793504 +0000 UTC m=+0.124993079 container start 796ae269f72fee80037eb2ceaf5269c6cb6cc641f8ffccc11c84f097a087418d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_taussig, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507)
Dec 01 09:24:48 compute-0 ceph-mon[75031]: pgmap v385: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:48 compute-0 podman[163957]: 2025-12-01 09:24:48.43256384 +0000 UTC m=+0.172763445 container attach 796ae269f72fee80037eb2ceaf5269c6cb6cc641f8ffccc11c84f097a087418d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_taussig, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:24:48 compute-0 python3.9[163992]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:24:48 compute-0 sudo[163988]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:48 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:24:48 compute-0 sudo[164158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ancindogwqnhyimmdksgewczkholpyvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581088.647361-166-14855446853841/AnsiballZ_file.py'
Dec 01 09:24:48 compute-0 sudo[164158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:49 compute-0 podman[164123]: 2025-12-01 09:24:49.030134538 +0000 UTC m=+0.119408296 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:24:49 compute-0 youthful_taussig[163995]: {
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:     "0": [
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:         {
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             "devices": [
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "/dev/loop3"
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             ],
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             "lv_name": "ceph_lv0",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             "lv_size": "21470642176",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             "name": "ceph_lv0",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             "tags": {
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.cluster_name": "ceph",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.crush_device_class": "",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.encrypted": "0",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.osd_id": "0",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.type": "block",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.vdo": "0"
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             },
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             "type": "block",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             "vg_name": "ceph_vg0"
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:         }
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:     ],
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:     "1": [
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:         {
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             "devices": [
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "/dev/loop4"
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             ],
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             "lv_name": "ceph_lv1",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             "lv_size": "21470642176",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             "name": "ceph_lv1",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             "tags": {
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.cluster_name": "ceph",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.crush_device_class": "",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.encrypted": "0",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.osd_id": "1",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.type": "block",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.vdo": "0"
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             },
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             "type": "block",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             "vg_name": "ceph_vg1"
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:         }
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:     ],
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:     "2": [
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:         {
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             "devices": [
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "/dev/loop5"
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             ],
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             "lv_name": "ceph_lv2",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             "lv_size": "21470642176",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             "name": "ceph_lv2",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             "tags": {
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.cluster_name": "ceph",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.crush_device_class": "",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.encrypted": "0",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.osd_id": "2",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.type": "block",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:                 "ceph.vdo": "0"
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             },
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             "type": "block",
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:             "vg_name": "ceph_vg2"
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:         }
Dec 01 09:24:49 compute-0 youthful_taussig[163995]:     ]
Dec 01 09:24:49 compute-0 youthful_taussig[163995]: }
Dec 01 09:24:49 compute-0 systemd[1]: libpod-796ae269f72fee80037eb2ceaf5269c6cb6cc641f8ffccc11c84f097a087418d.scope: Deactivated successfully.
Dec 01 09:24:49 compute-0 podman[163957]: 2025-12-01 09:24:49.178878605 +0000 UTC m=+0.919078190 container died 796ae269f72fee80037eb2ceaf5269c6cb6cc641f8ffccc11c84f097a087418d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_taussig, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_REF=reef, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:24:49 compute-0 python3.9[164169]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:24:49 compute-0 sudo[164158]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:49 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v386: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:49 compute-0 systemd[1]: var-lib-containers-storage-overlay-33abc402fbff47d6f65d10ddc4a5ea1eecbbb6fe8e9a4009a6a3e278e41c1946-merged.mount: Deactivated successfully.
Dec 01 09:24:49 compute-0 podman[163957]: 2025-12-01 09:24:49.550847075 +0000 UTC m=+1.291046650 container remove 796ae269f72fee80037eb2ceaf5269c6cb6cc641f8ffccc11c84f097a087418d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_taussig, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:24:49 compute-0 sudo[164336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghhkcmzoiicdjtgrfclvpwbjlzuiamng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581089.3318858-166-81735784354796/AnsiballZ_file.py'
Dec 01 09:24:49 compute-0 sudo[164336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:49 compute-0 sudo[163699]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:49 compute-0 systemd[1]: libpod-conmon-796ae269f72fee80037eb2ceaf5269c6cb6cc641f8ffccc11c84f097a087418d.scope: Deactivated successfully.
Dec 01 09:24:49 compute-0 sudo[164339]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:24:49 compute-0 sudo[164339]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:24:49 compute-0 sudo[164339]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:49 compute-0 sudo[164364]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:24:49 compute-0 sudo[164364]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:24:49 compute-0 sudo[164364]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:49 compute-0 python3.9[164338]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:24:49 compute-0 sudo[164336]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:49 compute-0 sudo[164389]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:24:49 compute-0 sudo[164389]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:24:49 compute-0 sudo[164389]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:49 compute-0 sudo[164414]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- raw list --format json
Dec 01 09:24:49 compute-0 sudo[164414]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:24:50 compute-0 sudo[164640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rehqedxzwqjuleucbunxtpmtnuxzndmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581089.9104924-166-133881693659500/AnsiballZ_file.py'
Dec 01 09:24:50 compute-0 sudo[164640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:50 compute-0 podman[164614]: 2025-12-01 09:24:50.232677137 +0000 UTC m=+0.051907371 container create 41d1541c1ac1e1a2cefa0865c29904e2691053405db5212baa69e45201a259de (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_nobel, ceph=True, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Dec 01 09:24:50 compute-0 systemd[1]: Started libpod-conmon-41d1541c1ac1e1a2cefa0865c29904e2691053405db5212baa69e45201a259de.scope.
Dec 01 09:24:50 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:24:50 compute-0 podman[164614]: 2025-12-01 09:24:50.203273863 +0000 UTC m=+0.022504077 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:24:50 compute-0 podman[164614]: 2025-12-01 09:24:50.316283393 +0000 UTC m=+0.135513627 container init 41d1541c1ac1e1a2cefa0865c29904e2691053405db5212baa69e45201a259de (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_nobel, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:24:50 compute-0 podman[164614]: 2025-12-01 09:24:50.3267807 +0000 UTC m=+0.146010924 container start 41d1541c1ac1e1a2cefa0865c29904e2691053405db5212baa69e45201a259de (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_nobel, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507)
Dec 01 09:24:50 compute-0 podman[164614]: 2025-12-01 09:24:50.330886432 +0000 UTC m=+0.150116666 container attach 41d1541c1ac1e1a2cefa0865c29904e2691053405db5212baa69e45201a259de (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_nobel, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:24:50 compute-0 affectionate_nobel[164647]: 167 167
Dec 01 09:24:50 compute-0 systemd[1]: libpod-41d1541c1ac1e1a2cefa0865c29904e2691053405db5212baa69e45201a259de.scope: Deactivated successfully.
Dec 01 09:24:50 compute-0 podman[164614]: 2025-12-01 09:24:50.334782088 +0000 UTC m=+0.154012332 container died 41d1541c1ac1e1a2cefa0865c29904e2691053405db5212baa69e45201a259de (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_nobel, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Dec 01 09:24:50 compute-0 systemd[1]: var-lib-containers-storage-overlay-f068f6e84383dd90d8e92dbcaa7c77d47b810b45c16d1a9d179be7beb8de76aa-merged.mount: Deactivated successfully.
Dec 01 09:24:50 compute-0 podman[164614]: 2025-12-01 09:24:50.397094952 +0000 UTC m=+0.216325186 container remove 41d1541c1ac1e1a2cefa0865c29904e2691053405db5212baa69e45201a259de (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_nobel, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec 01 09:24:50 compute-0 python3.9[164644]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:24:50 compute-0 systemd[1]: libpod-conmon-41d1541c1ac1e1a2cefa0865c29904e2691053405db5212baa69e45201a259de.scope: Deactivated successfully.
Dec 01 09:24:50 compute-0 sudo[164640]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:50 compute-0 ceph-mon[75031]: pgmap v386: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:50 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #24. Immutable memtables: 0.
Dec 01 09:24:50 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:24:50.460382) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 09:24:50 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 24
Dec 01 09:24:50 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581090460406, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 1524, "num_deletes": 251, "total_data_size": 1671699, "memory_usage": 1699968, "flush_reason": "Manual Compaction"}
Dec 01 09:24:50 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #25: started
Dec 01 09:24:50 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581090475222, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 25, "file_size": 1629622, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7458, "largest_seqno": 8981, "table_properties": {"data_size": 1622607, "index_size": 4090, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1861, "raw_key_size": 13908, "raw_average_key_size": 19, "raw_value_size": 1608537, "raw_average_value_size": 2206, "num_data_blocks": 192, "num_entries": 729, "num_filter_entries": 729, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580924, "oldest_key_time": 1764580924, "file_creation_time": 1764581090, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 25, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:24:50 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 14872 microseconds, and 4688 cpu microseconds.
Dec 01 09:24:50 compute-0 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 09:24:50 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:24:50.475258) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #25: 1629622 bytes OK
Dec 01 09:24:50 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:24:50.475271) [db/memtable_list.cc:519] [default] Level-0 commit table #25 started
Dec 01 09:24:50 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:24:50.477268) [db/memtable_list.cc:722] [default] Level-0 commit table #25: memtable #1 done
Dec 01 09:24:50 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:24:50.477282) EVENT_LOG_v1 {"time_micros": 1764581090477278, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 01 09:24:50 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:24:50.477317) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 01 09:24:50 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 1665052, prev total WAL file size 1665052, number of live WAL files 2.
Dec 01 09:24:50 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000021.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:24:50 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:24:50.477823) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300323531' seq:72057594037927935, type:22 .. '7061786F7300353033' seq:0, type:0; will stop at (end)
Dec 01 09:24:50 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 01 09:24:50 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [25(1591KB)], [23(4106KB)]
Dec 01 09:24:50 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581090477850, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [25], "files_L6": [23], "score": -1, "input_data_size": 5834408, "oldest_snapshot_seqno": -1}
Dec 01 09:24:50 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #26: 2786 keys, 4626085 bytes, temperature: kUnknown
Dec 01 09:24:50 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581090508870, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 26, "file_size": 4626085, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 4605363, "index_size": 12677, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 6981, "raw_key_size": 64803, "raw_average_key_size": 23, "raw_value_size": 4553249, "raw_average_value_size": 1634, "num_data_blocks": 570, "num_entries": 2786, "num_filter_entries": 2786, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580340, "oldest_key_time": 0, "file_creation_time": 1764581090, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:24:50 compute-0 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 09:24:50 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:24:50.509044) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 4626085 bytes
Dec 01 09:24:50 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:24:50.510455) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 187.8 rd, 148.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.6, 4.0 +0.0 blob) out(4.4 +0.0 blob), read-write-amplify(6.4) write-amplify(2.8) OK, records in: 3300, records dropped: 514 output_compression: NoCompression
Dec 01 09:24:50 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:24:50.510470) EVENT_LOG_v1 {"time_micros": 1764581090510462, "job": 8, "event": "compaction_finished", "compaction_time_micros": 31072, "compaction_time_cpu_micros": 12548, "output_level": 6, "num_output_files": 1, "total_output_size": 4626085, "num_input_records": 3300, "num_output_records": 2786, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 09:24:50 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000025.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:24:50 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581090510797, "job": 8, "event": "table_file_deletion", "file_number": 25}
Dec 01 09:24:50 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:24:50 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581090511489, "job": 8, "event": "table_file_deletion", "file_number": 23}
Dec 01 09:24:50 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:24:50.477757) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:24:50 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:24:50.511524) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:24:50 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:24:50.511529) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:24:50 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:24:50.511531) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:24:50 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:24:50.511532) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:24:50 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:24:50.511533) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:24:50 compute-0 podman[164697]: 2025-12-01 09:24:50.576651011 +0000 UTC m=+0.046183673 container create 3c6cf13341aa82e96b61c310ffecb34d9eef2663104d27128ee0c160301483da (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_goldberg, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:24:50 compute-0 systemd[1]: Started libpod-conmon-3c6cf13341aa82e96b61c310ffecb34d9eef2663104d27128ee0c160301483da.scope.
Dec 01 09:24:50 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:24:50 compute-0 podman[164697]: 2025-12-01 09:24:50.557173589 +0000 UTC m=+0.026706291 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:24:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0bc67315b39f5de3be56dd265f5c237673c0625fb35a83f434336d9dc37ea5b9/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:24:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0bc67315b39f5de3be56dd265f5c237673c0625fb35a83f434336d9dc37ea5b9/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:24:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0bc67315b39f5de3be56dd265f5c237673c0625fb35a83f434336d9dc37ea5b9/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:24:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0bc67315b39f5de3be56dd265f5c237673c0625fb35a83f434336d9dc37ea5b9/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:24:50 compute-0 podman[164697]: 2025-12-01 09:24:50.668561494 +0000 UTC m=+0.138094186 container init 3c6cf13341aa82e96b61c310ffecb34d9eef2663104d27128ee0c160301483da (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_goldberg, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=reef, ceph=True, OSD_FLAVOR=default)
Dec 01 09:24:50 compute-0 podman[164697]: 2025-12-01 09:24:50.679141013 +0000 UTC m=+0.148673685 container start 3c6cf13341aa82e96b61c310ffecb34d9eef2663104d27128ee0c160301483da (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_goldberg, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS)
Dec 01 09:24:50 compute-0 podman[164697]: 2025-12-01 09:24:50.682953998 +0000 UTC m=+0.152486690 container attach 3c6cf13341aa82e96b61c310ffecb34d9eef2663104d27128ee0c160301483da (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_goldberg, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, ceph=True)
Dec 01 09:24:50 compute-0 sudo[164843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnviuaqpxpbahibsdjrsgapjbkestavj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581090.58182-166-51397679827103/AnsiballZ_file.py'
Dec 01 09:24:50 compute-0 sudo[164843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:51 compute-0 python3.9[164845]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:24:51 compute-0 sudo[164843]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:51 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v387: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:51 compute-0 sudo[165021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpjchhmjouajoesxlqanragqqlujbtac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581091.3907025-217-102629967569597/AnsiballZ_command.py'
Dec 01 09:24:51 compute-0 sudo[165021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:51 compute-0 upbeat_goldberg[164745]: {
Dec 01 09:24:51 compute-0 upbeat_goldberg[164745]:     "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec 01 09:24:51 compute-0 upbeat_goldberg[164745]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:24:51 compute-0 upbeat_goldberg[164745]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 01 09:24:51 compute-0 upbeat_goldberg[164745]:         "osd_id": 0,
Dec 01 09:24:51 compute-0 upbeat_goldberg[164745]:         "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:24:51 compute-0 upbeat_goldberg[164745]:         "type": "bluestore"
Dec 01 09:24:51 compute-0 upbeat_goldberg[164745]:     },
Dec 01 09:24:51 compute-0 upbeat_goldberg[164745]:     "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec 01 09:24:51 compute-0 upbeat_goldberg[164745]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:24:51 compute-0 upbeat_goldberg[164745]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 01 09:24:51 compute-0 upbeat_goldberg[164745]:         "osd_id": 1,
Dec 01 09:24:51 compute-0 upbeat_goldberg[164745]:         "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:24:51 compute-0 upbeat_goldberg[164745]:         "type": "bluestore"
Dec 01 09:24:51 compute-0 upbeat_goldberg[164745]:     },
Dec 01 09:24:51 compute-0 upbeat_goldberg[164745]:     "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec 01 09:24:51 compute-0 upbeat_goldberg[164745]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:24:51 compute-0 upbeat_goldberg[164745]:         "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec 01 09:24:51 compute-0 upbeat_goldberg[164745]:         "osd_id": 2,
Dec 01 09:24:51 compute-0 upbeat_goldberg[164745]:         "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:24:51 compute-0 upbeat_goldberg[164745]:         "type": "bluestore"
Dec 01 09:24:51 compute-0 upbeat_goldberg[164745]:     }
Dec 01 09:24:51 compute-0 upbeat_goldberg[164745]: }
Dec 01 09:24:51 compute-0 systemd[1]: libpod-3c6cf13341aa82e96b61c310ffecb34d9eef2663104d27128ee0c160301483da.scope: Deactivated successfully.
Dec 01 09:24:51 compute-0 podman[164697]: 2025-12-01 09:24:51.752397866 +0000 UTC m=+1.221930548 container died 3c6cf13341aa82e96b61c310ffecb34d9eef2663104d27128ee0c160301483da (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_goldberg, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:24:51 compute-0 systemd[1]: libpod-3c6cf13341aa82e96b61c310ffecb34d9eef2663104d27128ee0c160301483da.scope: Consumed 1.072s CPU time.
Dec 01 09:24:51 compute-0 systemd[1]: var-lib-containers-storage-overlay-0bc67315b39f5de3be56dd265f5c237673c0625fb35a83f434336d9dc37ea5b9-merged.mount: Deactivated successfully.
Dec 01 09:24:51 compute-0 podman[164697]: 2025-12-01 09:24:51.81396116 +0000 UTC m=+1.283493812 container remove 3c6cf13341aa82e96b61c310ffecb34d9eef2663104d27128ee0c160301483da (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_goldberg, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec 01 09:24:51 compute-0 systemd[1]: libpod-conmon-3c6cf13341aa82e96b61c310ffecb34d9eef2663104d27128ee0c160301483da.scope: Deactivated successfully.
Dec 01 09:24:51 compute-0 sudo[164414]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:51 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:24:51 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:24:51 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:24:51 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:24:51 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev ef520ca2-5572-4fa5-8487-4f03d920b2cc does not exist
Dec 01 09:24:51 compute-0 python3.9[165023]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:24:51 compute-0 sudo[165021]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:51 compute-0 sudo[165037]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:24:51 compute-0 sudo[165037]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:24:51 compute-0 sudo[165037]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:51 compute-0 sudo[165067]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:24:51 compute-0 sudo[165067]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:24:51 compute-0 sudo[165067]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:52 compute-0 python3.9[165238]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 01 09:24:52 compute-0 ceph-mon[75031]: pgmap v387: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:52 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:24:52 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:24:53 compute-0 sudo[165388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlfwcsscabckspyaogabcqfjybanytci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581092.9362617-235-107063996297093/AnsiballZ_systemd_service.py'
Dec 01 09:24:53 compute-0 sudo[165388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:53 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v388: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:53 compute-0 python3.9[165390]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 01 09:24:53 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:24:53 compute-0 systemd[1]: Reloading.
Dec 01 09:24:53 compute-0 systemd-rc-local-generator[165420]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:24:53 compute-0 systemd-sysv-generator[165425]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:24:53 compute-0 sudo[165388]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:54 compute-0 sudo[165576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drvsswslqeipfwakenlzyfqwojfshium ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581094.0701525-243-267791806443528/AnsiballZ_command.py'
Dec 01 09:24:54 compute-0 sudo[165576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:54 compute-0 python3.9[165578]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:24:54 compute-0 sudo[165576]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:54 compute-0 ceph-mon[75031]: pgmap v388: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:54 compute-0 sudo[165729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwfzdmfkpscohrngkeucimdwmmsjyggx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581094.673955-243-40117951028118/AnsiballZ_command.py'
Dec 01 09:24:54 compute-0 sudo[165729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:55 compute-0 python3.9[165731]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:24:55 compute-0 sudo[165729]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:55 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v389: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:55 compute-0 sudo[165882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhxvbsxiphjvjwjxpsdvitxtnrdpndjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581095.3471532-243-42006859604662/AnsiballZ_command.py'
Dec 01 09:24:55 compute-0 sudo[165882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:55 compute-0 python3.9[165884]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:24:55 compute-0 sudo[165882]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:56 compute-0 sudo[166035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yalswggnpseoilxzizfrvqkvcbektcmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581096.0204515-243-49402112485488/AnsiballZ_command.py'
Dec 01 09:24:56 compute-0 sudo[166035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:56 compute-0 python3.9[166037]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:24:56 compute-0 sudo[166035]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:56 compute-0 ceph-mon[75031]: pgmap v389: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:56 compute-0 sudo[166188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jiasdriqxblatljycqmjfitggkozqljr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581096.6589453-243-276662417854322/AnsiballZ_command.py'
Dec 01 09:24:56 compute-0 sudo[166188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:57 compute-0 python3.9[166190]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:24:57 compute-0 sudo[166188]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:57 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v390: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:57 compute-0 sudo[166341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtbnodsudubrefpczcnghjnbbdrdmnbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581097.3049884-243-240369196846045/AnsiballZ_command.py'
Dec 01 09:24:57 compute-0 sudo[166341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:57 compute-0 python3.9[166343]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:24:58 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:24:58 compute-0 sudo[166341]: pam_unix(sudo:session): session closed for user root
Dec 01 09:24:58 compute-0 ceph-mon[75031]: pgmap v390: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:59 compute-0 sudo[166494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkvjsdteqakiyglnljbcturbycqpssdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581098.9533138-243-200503571203872/AnsiballZ_command.py'
Dec 01 09:24:59 compute-0 sudo[166494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:24:59 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v391: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:24:59 compute-0 python3.9[166496]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:24:59 compute-0 sudo[166494]: pam_unix(sudo:session): session closed for user root
Dec 01 09:25:00 compute-0 sudo[166647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-airkgwsplxzprsrntecvsgpwatczwdjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581099.846101-297-32950880390087/AnsiballZ_getent.py'
Dec 01 09:25:00 compute-0 sudo[166647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:25:00 compute-0 python3.9[166649]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Dec 01 09:25:00 compute-0 sudo[166647]: pam_unix(sudo:session): session closed for user root
Dec 01 09:25:00 compute-0 ceph-mon[75031]: pgmap v391: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:01 compute-0 sudo[166800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccshzzsskrnzfqwxllxnkemvwjvgyuzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581100.7359695-305-211500970477306/AnsiballZ_group.py'
Dec 01 09:25:01 compute-0 sudo[166800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:25:01 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v392: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:01 compute-0 python3.9[166802]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 01 09:25:01 compute-0 groupadd[166803]: group added to /etc/group: name=libvirt, GID=42473
Dec 01 09:25:02 compute-0 groupadd[166803]: group added to /etc/gshadow: name=libvirt
Dec 01 09:25:02 compute-0 groupadd[166803]: new group: name=libvirt, GID=42473
Dec 01 09:25:02 compute-0 sudo[166800]: pam_unix(sudo:session): session closed for user root
Dec 01 09:25:02 compute-0 ceph-mon[75031]: pgmap v392: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:03 compute-0 sudo[166958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-waobsqzykchrpplurmxcbwladarqstok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581102.5995204-313-251862330016599/AnsiballZ_user.py'
Dec 01 09:25:03 compute-0 sudo[166958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:25:03 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v393: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:03 compute-0 python3.9[166960]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 01 09:25:03 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:25:03 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 01 09:25:03 compute-0 useradd[166962]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Dec 01 09:25:03 compute-0 sudo[166958]: pam_unix(sudo:session): session closed for user root
Dec 01 09:25:04 compute-0 sudo[167119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrskwpctvvgnirrkgbhvyqmucnloedcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581104.2044277-324-265934289665452/AnsiballZ_setup.py'
Dec 01 09:25:04 compute-0 sudo[167119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:25:04 compute-0 ceph-mon[75031]: pgmap v393: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:04 compute-0 python3.9[167121]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 09:25:05 compute-0 sudo[167119]: pam_unix(sudo:session): session closed for user root
Dec 01 09:25:05 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v394: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:05 compute-0 sudo[167203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vclbkslojywykqedzhlctrbcqssgzsax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581104.2044277-324-265934289665452/AnsiballZ_dnf.py'
Dec 01 09:25:05 compute-0 sudo[167203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:25:05 compute-0 python3.9[167205]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:25:06 compute-0 ceph-mon[75031]: pgmap v394: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:07 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v395: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:08 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:25:08 compute-0 ceph-mon[75031]: pgmap v395: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:09 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v396: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:10 compute-0 ceph-mon[75031]: pgmap v396: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:11 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v397: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:12 compute-0 ceph-mon[75031]: pgmap v397: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:12 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:25:12
Dec 01 09:25:12 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:25:12 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:25:12 compute-0 ceph-mgr[75324]: [balancer INFO root] pools ['backups', 'cephfs.cephfs.data', 'images', 'volumes', 'vms', '.mgr', 'cephfs.cephfs.meta']
Dec 01 09:25:12 compute-0 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec 01 09:25:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:25:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:25:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:25:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:25:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:25:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:25:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:25:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:25:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:25:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:25:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:25:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:25:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:25:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:25:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:25:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:25:13 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v398: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:25:14 compute-0 ceph-mon[75031]: pgmap v398: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:15 compute-0 podman[167216]: 2025-12-01 09:25:15.099798878 +0000 UTC m=+0.179467688 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 01 09:25:15 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v399: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:16 compute-0 ceph-mon[75031]: pgmap v399: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:17 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v400: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:25:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:25:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:25:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:25:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:25:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:25:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:25:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:25:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:25:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:25:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:25:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:25:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec 01 09:25:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:25:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:25:18 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:25:19 compute-0 ceph-mon[75031]: pgmap v400: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:19 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v401: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:20 compute-0 podman[167242]: 2025-12-01 09:25:20.013429559 +0000 UTC m=+0.103560772 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 01 09:25:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:25:20.456 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:25:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:25:20.458 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:25:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:25:20.458 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:25:21 compute-0 ceph-mon[75031]: pgmap v401: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:21 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v402: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:22 compute-0 ceph-mon[75031]: pgmap v402: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:23 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v403: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:23 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:25:24 compute-0 ceph-mon[75031]: pgmap v403: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:25 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v404: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:26 compute-0 ceph-mon[75031]: pgmap v404: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:27 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v405: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:28 compute-0 ceph-mon[75031]: pgmap v405: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:28 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:25:29 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v406: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:30 compute-0 ceph-mon[75031]: pgmap v406: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:31 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v407: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:32 compute-0 ceph-mon[75031]: pgmap v407: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:33 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v408: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:33 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:25:34 compute-0 ceph-mon[75031]: pgmap v408: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:35 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v409: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:36 compute-0 ceph-mon[75031]: pgmap v409: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:37 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v410: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:38 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:25:38 compute-0 ceph-mon[75031]: pgmap v410: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:39 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v411: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:40 compute-0 ceph-mon[75031]: pgmap v411: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:41 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v412: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:42 compute-0 ceph-mon[75031]: pgmap v412: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:25:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:25:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:25:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:25:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:25:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:25:43 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v413: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:43 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:25:44 compute-0 ceph-mon[75031]: pgmap v413: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:45 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v414: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:46 compute-0 podman[167261]: 2025-12-01 09:25:46.000887899 +0000 UTC m=+0.102190326 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec 01 09:25:46 compute-0 ceph-mon[75031]: pgmap v414: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:47 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v415: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:48 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:25:48 compute-0 ceph-mon[75031]: pgmap v415: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:49 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v416: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:50 compute-0 ceph-mon[75031]: pgmap v416: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:51 compute-0 podman[167397]: 2025-12-01 09:25:51.000427579 +0000 UTC m=+0.093057853 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 01 09:25:51 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v417: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:52 compute-0 sudo[167452]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:25:52 compute-0 sudo[167452]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:25:52 compute-0 sudo[167452]: pam_unix(sudo:session): session closed for user root
Dec 01 09:25:52 compute-0 sudo[167480]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:25:52 compute-0 sudo[167480]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:25:52 compute-0 sudo[167480]: pam_unix(sudo:session): session closed for user root
Dec 01 09:25:52 compute-0 sudo[167508]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:25:52 compute-0 sudo[167508]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:25:52 compute-0 sudo[167508]: pam_unix(sudo:session): session closed for user root
Dec 01 09:25:52 compute-0 sudo[167535]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Dec 01 09:25:52 compute-0 sudo[167535]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:25:52 compute-0 sudo[167535]: pam_unix(sudo:session): session closed for user root
Dec 01 09:25:52 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:25:52 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:25:52 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec 01 09:25:52 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:25:52 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec 01 09:25:52 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:25:52 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 21bb6ec5-5182-4dd5-bbd9-91f80a0b53fc does not exist
Dec 01 09:25:52 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 034303ee-3b5a-4ddb-8e9f-d0017971e52e does not exist
Dec 01 09:25:52 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 50088751-42b1-4bef-bae5-606d374c5981 does not exist
Dec 01 09:25:52 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec 01 09:25:52 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:25:52 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec 01 09:25:52 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:25:52 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:25:52 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:25:52 compute-0 ceph-mon[75031]: pgmap v417: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:52 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:25:52 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:25:52 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:25:52 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:25:52 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:25:52 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:25:52 compute-0 sudo[167609]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:25:52 compute-0 sudo[167609]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:25:52 compute-0 sudo[167609]: pam_unix(sudo:session): session closed for user root
Dec 01 09:25:52 compute-0 sudo[167634]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:25:52 compute-0 sudo[167634]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:25:52 compute-0 sudo[167634]: pam_unix(sudo:session): session closed for user root
Dec 01 09:25:53 compute-0 sudo[167659]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:25:53 compute-0 sudo[167659]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:25:53 compute-0 sudo[167659]: pam_unix(sudo:session): session closed for user root
Dec 01 09:25:53 compute-0 sudo[167684]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --yes --no-systemd
Dec 01 09:25:53 compute-0 sudo[167684]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:25:53 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v418: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:53 compute-0 podman[167749]: 2025-12-01 09:25:53.445983603 +0000 UTC m=+0.037752503 container create aeb4c9b42d061b7849a72843e5146aa8de6bfa9411b9bcde7dd7edf96b73555b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_meninsky, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:25:53 compute-0 systemd[1]: Started libpod-conmon-aeb4c9b42d061b7849a72843e5146aa8de6bfa9411b9bcde7dd7edf96b73555b.scope.
Dec 01 09:25:53 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:25:53 compute-0 podman[167749]: 2025-12-01 09:25:53.523618138 +0000 UTC m=+0.115387058 container init aeb4c9b42d061b7849a72843e5146aa8de6bfa9411b9bcde7dd7edf96b73555b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_meninsky, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True)
Dec 01 09:25:53 compute-0 podman[167749]: 2025-12-01 09:25:53.430102673 +0000 UTC m=+0.021871593 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:25:53 compute-0 podman[167749]: 2025-12-01 09:25:53.531738273 +0000 UTC m=+0.123507173 container start aeb4c9b42d061b7849a72843e5146aa8de6bfa9411b9bcde7dd7edf96b73555b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_meninsky, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec 01 09:25:53 compute-0 podman[167749]: 2025-12-01 09:25:53.535243864 +0000 UTC m=+0.127012784 container attach aeb4c9b42d061b7849a72843e5146aa8de6bfa9411b9bcde7dd7edf96b73555b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_meninsky, io.buildah.version=1.39.3, ceph=True, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:25:53 compute-0 festive_meninsky[167765]: 167 167
Dec 01 09:25:53 compute-0 systemd[1]: libpod-aeb4c9b42d061b7849a72843e5146aa8de6bfa9411b9bcde7dd7edf96b73555b.scope: Deactivated successfully.
Dec 01 09:25:53 compute-0 conmon[167765]: conmon aeb4c9b42d061b7849a7 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-aeb4c9b42d061b7849a72843e5146aa8de6bfa9411b9bcde7dd7edf96b73555b.scope/container/memory.events
Dec 01 09:25:53 compute-0 podman[167749]: 2025-12-01 09:25:53.538163119 +0000 UTC m=+0.129932029 container died aeb4c9b42d061b7849a72843e5146aa8de6bfa9411b9bcde7dd7edf96b73555b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_meninsky, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:25:53 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:25:53 compute-0 systemd[1]: var-lib-containers-storage-overlay-30cdbac835516680dadd3061d3d227f72a445f657746992042900490d6e0481f-merged.mount: Deactivated successfully.
Dec 01 09:25:53 compute-0 podman[167749]: 2025-12-01 09:25:53.641349583 +0000 UTC m=+0.233118513 container remove aeb4c9b42d061b7849a72843e5146aa8de6bfa9411b9bcde7dd7edf96b73555b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_meninsky, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:25:53 compute-0 systemd[1]: libpod-conmon-aeb4c9b42d061b7849a72843e5146aa8de6bfa9411b9bcde7dd7edf96b73555b.scope: Deactivated successfully.
Dec 01 09:25:53 compute-0 podman[167789]: 2025-12-01 09:25:53.895204346 +0000 UTC m=+0.111087084 container create 93aceb1871dc14fba0ab36bb0024881890b6bfb0a34b20709e85ee7d3aae568f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_mirzakhani, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Dec 01 09:25:53 compute-0 podman[167789]: 2025-12-01 09:25:53.812883395 +0000 UTC m=+0.028766153 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:25:53 compute-0 systemd[1]: Started libpod-conmon-93aceb1871dc14fba0ab36bb0024881890b6bfb0a34b20709e85ee7d3aae568f.scope.
Dec 01 09:25:53 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:25:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/94e4ad049bf3f3c465a5d7e940dcd7307c8da0580480d239edde753ba9ba3c32/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:25:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/94e4ad049bf3f3c465a5d7e940dcd7307c8da0580480d239edde753ba9ba3c32/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:25:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/94e4ad049bf3f3c465a5d7e940dcd7307c8da0580480d239edde753ba9ba3c32/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:25:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/94e4ad049bf3f3c465a5d7e940dcd7307c8da0580480d239edde753ba9ba3c32/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:25:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/94e4ad049bf3f3c465a5d7e940dcd7307c8da0580480d239edde753ba9ba3c32/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:25:53 compute-0 podman[167789]: 2025-12-01 09:25:53.991409658 +0000 UTC m=+0.207292416 container init 93aceb1871dc14fba0ab36bb0024881890b6bfb0a34b20709e85ee7d3aae568f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_mirzakhani, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3)
Dec 01 09:25:53 compute-0 podman[167789]: 2025-12-01 09:25:53.997855405 +0000 UTC m=+0.213738143 container start 93aceb1871dc14fba0ab36bb0024881890b6bfb0a34b20709e85ee7d3aae568f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_mirzakhani, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Dec 01 09:25:54 compute-0 podman[167789]: 2025-12-01 09:25:54.000849261 +0000 UTC m=+0.216732019 container attach 93aceb1871dc14fba0ab36bb0024881890b6bfb0a34b20709e85ee7d3aae568f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_mirzakhani, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:25:54 compute-0 ceph-mon[75031]: pgmap v418: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:55 compute-0 gifted_mirzakhani[167805]: --> passed data devices: 0 physical, 3 LVM
Dec 01 09:25:55 compute-0 gifted_mirzakhani[167805]: --> relative data size: 1.0
Dec 01 09:25:55 compute-0 gifted_mirzakhani[167805]: --> All data devices are unavailable
Dec 01 09:25:55 compute-0 systemd[1]: libpod-93aceb1871dc14fba0ab36bb0024881890b6bfb0a34b20709e85ee7d3aae568f.scope: Deactivated successfully.
Dec 01 09:25:55 compute-0 podman[167789]: 2025-12-01 09:25:55.061391876 +0000 UTC m=+1.277274624 container died 93aceb1871dc14fba0ab36bb0024881890b6bfb0a34b20709e85ee7d3aae568f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_mirzakhani, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:25:55 compute-0 systemd[1]: var-lib-containers-storage-overlay-94e4ad049bf3f3c465a5d7e940dcd7307c8da0580480d239edde753ba9ba3c32-merged.mount: Deactivated successfully.
Dec 01 09:25:55 compute-0 podman[167789]: 2025-12-01 09:25:55.302198941 +0000 UTC m=+1.518081689 container remove 93aceb1871dc14fba0ab36bb0024881890b6bfb0a34b20709e85ee7d3aae568f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_mirzakhani, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:25:55 compute-0 systemd[1]: libpod-conmon-93aceb1871dc14fba0ab36bb0024881890b6bfb0a34b20709e85ee7d3aae568f.scope: Deactivated successfully.
Dec 01 09:25:55 compute-0 sudo[167684]: pam_unix(sudo:session): session closed for user root
Dec 01 09:25:55 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v419: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:55 compute-0 sudo[167846]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:25:55 compute-0 sudo[167846]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:25:55 compute-0 sudo[167846]: pam_unix(sudo:session): session closed for user root
Dec 01 09:25:55 compute-0 sudo[167871]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:25:55 compute-0 sudo[167871]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:25:55 compute-0 sudo[167871]: pam_unix(sudo:session): session closed for user root
Dec 01 09:25:55 compute-0 sudo[167896]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:25:55 compute-0 sudo[167896]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:25:55 compute-0 sudo[167896]: pam_unix(sudo:session): session closed for user root
Dec 01 09:25:55 compute-0 sudo[167921]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- lvm list --format json
Dec 01 09:25:55 compute-0 sudo[167921]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:25:55 compute-0 podman[167986]: 2025-12-01 09:25:55.9196633 +0000 UTC m=+0.041445370 container create 8db0234334be7b1e0f76d0b27d6b4fba0e19812bbfc67d551ff8da5801469167 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=awesome_chandrasekhar, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 01 09:25:55 compute-0 systemd[1]: Started libpod-conmon-8db0234334be7b1e0f76d0b27d6b4fba0e19812bbfc67d551ff8da5801469167.scope.
Dec 01 09:25:55 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:25:55 compute-0 podman[167986]: 2025-12-01 09:25:55.900199347 +0000 UTC m=+0.021981457 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:25:56 compute-0 podman[167986]: 2025-12-01 09:25:56.007271644 +0000 UTC m=+0.129053734 container init 8db0234334be7b1e0f76d0b27d6b4fba0e19812bbfc67d551ff8da5801469167 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=awesome_chandrasekhar, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.schema-version=1.0)
Dec 01 09:25:56 compute-0 podman[167986]: 2025-12-01 09:25:56.01404482 +0000 UTC m=+0.135826890 container start 8db0234334be7b1e0f76d0b27d6b4fba0e19812bbfc67d551ff8da5801469167 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=awesome_chandrasekhar, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:25:56 compute-0 awesome_chandrasekhar[168002]: 167 167
Dec 01 09:25:56 compute-0 podman[167986]: 2025-12-01 09:25:56.017534821 +0000 UTC m=+0.139316921 container attach 8db0234334be7b1e0f76d0b27d6b4fba0e19812bbfc67d551ff8da5801469167 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=awesome_chandrasekhar, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:25:56 compute-0 systemd[1]: libpod-8db0234334be7b1e0f76d0b27d6b4fba0e19812bbfc67d551ff8da5801469167.scope: Deactivated successfully.
Dec 01 09:25:56 compute-0 podman[167986]: 2025-12-01 09:25:56.020365753 +0000 UTC m=+0.142147853 container died 8db0234334be7b1e0f76d0b27d6b4fba0e19812bbfc67d551ff8da5801469167 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=awesome_chandrasekhar, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:25:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-8ccf423937655d59ed2d7f28e51d92fdd822d44310e62c3ce684e787631998c7-merged.mount: Deactivated successfully.
Dec 01 09:25:56 compute-0 podman[167986]: 2025-12-01 09:25:56.05690912 +0000 UTC m=+0.178691190 container remove 8db0234334be7b1e0f76d0b27d6b4fba0e19812bbfc67d551ff8da5801469167 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=awesome_chandrasekhar, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:25:56 compute-0 systemd[1]: libpod-conmon-8db0234334be7b1e0f76d0b27d6b4fba0e19812bbfc67d551ff8da5801469167.scope: Deactivated successfully.
Dec 01 09:25:56 compute-0 podman[168028]: 2025-12-01 09:25:56.253673071 +0000 UTC m=+0.063428916 container create cdff25bbfeddcb3e361746abb776521eb3b5adc71b6f26911420a4be6cb6cd02 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_mahavira, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0)
Dec 01 09:25:56 compute-0 systemd[1]: Started libpod-conmon-cdff25bbfeddcb3e361746abb776521eb3b5adc71b6f26911420a4be6cb6cd02.scope.
Dec 01 09:25:56 compute-0 podman[168028]: 2025-12-01 09:25:56.233895269 +0000 UTC m=+0.043651094 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:25:56 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:25:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/75da4ca3d8fd2c9c0f30ae86f73b2cb52eeb0746162362cde4d27f33d335fefb/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:25:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/75da4ca3d8fd2c9c0f30ae86f73b2cb52eeb0746162362cde4d27f33d335fefb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:25:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/75da4ca3d8fd2c9c0f30ae86f73b2cb52eeb0746162362cde4d27f33d335fefb/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:25:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/75da4ca3d8fd2c9c0f30ae86f73b2cb52eeb0746162362cde4d27f33d335fefb/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:25:56 compute-0 podman[168028]: 2025-12-01 09:25:56.35493209 +0000 UTC m=+0.164687935 container init cdff25bbfeddcb3e361746abb776521eb3b5adc71b6f26911420a4be6cb6cd02 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_mahavira, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3)
Dec 01 09:25:56 compute-0 podman[168028]: 2025-12-01 09:25:56.363842257 +0000 UTC m=+0.173598072 container start cdff25bbfeddcb3e361746abb776521eb3b5adc71b6f26911420a4be6cb6cd02 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_mahavira, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec 01 09:25:56 compute-0 podman[168028]: 2025-12-01 09:25:56.366852644 +0000 UTC m=+0.176608539 container attach cdff25bbfeddcb3e361746abb776521eb3b5adc71b6f26911420a4be6cb6cd02 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_mahavira, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS)
Dec 01 09:25:56 compute-0 ceph-mon[75031]: pgmap v419: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:57 compute-0 modest_mahavira[168044]: {
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:     "0": [
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:         {
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             "devices": [
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "/dev/loop3"
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             ],
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             "lv_name": "ceph_lv0",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             "lv_size": "21470642176",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             "name": "ceph_lv0",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             "tags": {
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.cluster_name": "ceph",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.crush_device_class": "",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.encrypted": "0",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.osd_id": "0",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.type": "block",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.vdo": "0"
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             },
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             "type": "block",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             "vg_name": "ceph_vg0"
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:         }
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:     ],
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:     "1": [
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:         {
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             "devices": [
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "/dev/loop4"
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             ],
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             "lv_name": "ceph_lv1",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             "lv_size": "21470642176",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             "name": "ceph_lv1",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             "tags": {
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.cluster_name": "ceph",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.crush_device_class": "",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.encrypted": "0",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.osd_id": "1",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.type": "block",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.vdo": "0"
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             },
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             "type": "block",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             "vg_name": "ceph_vg1"
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:         }
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:     ],
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:     "2": [
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:         {
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             "devices": [
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "/dev/loop5"
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             ],
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             "lv_name": "ceph_lv2",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             "lv_size": "21470642176",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             "name": "ceph_lv2",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             "tags": {
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.cluster_name": "ceph",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.crush_device_class": "",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.encrypted": "0",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.osd_id": "2",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.type": "block",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:                 "ceph.vdo": "0"
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             },
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             "type": "block",
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:             "vg_name": "ceph_vg2"
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:         }
Dec 01 09:25:57 compute-0 modest_mahavira[168044]:     ]
Dec 01 09:25:57 compute-0 modest_mahavira[168044]: }
Dec 01 09:25:57 compute-0 systemd[1]: libpod-cdff25bbfeddcb3e361746abb776521eb3b5adc71b6f26911420a4be6cb6cd02.scope: Deactivated successfully.
Dec 01 09:25:57 compute-0 podman[168028]: 2025-12-01 09:25:57.12213042 +0000 UTC m=+0.931886225 container died cdff25bbfeddcb3e361746abb776521eb3b5adc71b6f26911420a4be6cb6cd02 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_mahavira, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:25:57 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v420: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:57 compute-0 systemd[1]: var-lib-containers-storage-overlay-75da4ca3d8fd2c9c0f30ae86f73b2cb52eeb0746162362cde4d27f33d335fefb-merged.mount: Deactivated successfully.
Dec 01 09:25:57 compute-0 podman[168028]: 2025-12-01 09:25:57.387620509 +0000 UTC m=+1.197376324 container remove cdff25bbfeddcb3e361746abb776521eb3b5adc71b6f26911420a4be6cb6cd02 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_mahavira, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:25:57 compute-0 systemd[1]: libpod-conmon-cdff25bbfeddcb3e361746abb776521eb3b5adc71b6f26911420a4be6cb6cd02.scope: Deactivated successfully.
Dec 01 09:25:57 compute-0 sudo[167921]: pam_unix(sudo:session): session closed for user root
Dec 01 09:25:57 compute-0 sudo[168065]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:25:57 compute-0 sudo[168065]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:25:57 compute-0 sudo[168065]: pam_unix(sudo:session): session closed for user root
Dec 01 09:25:57 compute-0 sudo[168090]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:25:57 compute-0 sudo[168090]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:25:57 compute-0 sudo[168090]: pam_unix(sudo:session): session closed for user root
Dec 01 09:25:57 compute-0 sudo[168115]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:25:57 compute-0 sudo[168115]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:25:57 compute-0 sudo[168115]: pam_unix(sudo:session): session closed for user root
Dec 01 09:25:57 compute-0 sudo[168141]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- raw list --format json
Dec 01 09:25:57 compute-0 sudo[168141]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:25:58 compute-0 podman[168208]: 2025-12-01 09:25:58.028930818 +0000 UTC m=+0.048065781 container create e2e659ff3008ea08dbff8713e2075dd3c6ebe8d90bcf43c648ad12aae953c83c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=heuristic_northcutt, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS)
Dec 01 09:25:58 compute-0 systemd[1]: Started libpod-conmon-e2e659ff3008ea08dbff8713e2075dd3c6ebe8d90bcf43c648ad12aae953c83c.scope.
Dec 01 09:25:58 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:25:58 compute-0 podman[168208]: 2025-12-01 09:25:58.008429285 +0000 UTC m=+0.027564278 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:25:58 compute-0 podman[168208]: 2025-12-01 09:25:58.112370451 +0000 UTC m=+0.131505434 container init e2e659ff3008ea08dbff8713e2075dd3c6ebe8d90bcf43c648ad12aae953c83c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=heuristic_northcutt, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.build-date=20250507)
Dec 01 09:25:58 compute-0 podman[168208]: 2025-12-01 09:25:58.118999603 +0000 UTC m=+0.138134606 container start e2e659ff3008ea08dbff8713e2075dd3c6ebe8d90bcf43c648ad12aae953c83c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=heuristic_northcutt, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3)
Dec 01 09:25:58 compute-0 podman[168208]: 2025-12-01 09:25:58.123442342 +0000 UTC m=+0.142577325 container attach e2e659ff3008ea08dbff8713e2075dd3c6ebe8d90bcf43c648ad12aae953c83c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=heuristic_northcutt, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:25:58 compute-0 heuristic_northcutt[168226]: 167 167
Dec 01 09:25:58 compute-0 systemd[1]: libpod-e2e659ff3008ea08dbff8713e2075dd3c6ebe8d90bcf43c648ad12aae953c83c.scope: Deactivated successfully.
Dec 01 09:25:58 compute-0 podman[168208]: 2025-12-01 09:25:58.125980945 +0000 UTC m=+0.145115988 container died e2e659ff3008ea08dbff8713e2075dd3c6ebe8d90bcf43c648ad12aae953c83c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=heuristic_northcutt, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:25:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-5e0e0cdc0576956bfe5e149fc42e6c3b1fc283e1069f18e8cc30f68e0fa08ba1-merged.mount: Deactivated successfully.
Dec 01 09:25:58 compute-0 podman[168208]: 2025-12-01 09:25:58.180991516 +0000 UTC m=+0.200126479 container remove e2e659ff3008ea08dbff8713e2075dd3c6ebe8d90bcf43c648ad12aae953c83c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=heuristic_northcutt, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef)
Dec 01 09:25:58 compute-0 systemd[1]: libpod-conmon-e2e659ff3008ea08dbff8713e2075dd3c6ebe8d90bcf43c648ad12aae953c83c.scope: Deactivated successfully.
Dec 01 09:25:58 compute-0 podman[168252]: 2025-12-01 09:25:58.360229461 +0000 UTC m=+0.051863372 container create 906fb157a2f7ac3e305ff8a6d410af6b8b3559ec87fad44f83bed6225dbd0165 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=blissful_sutherland, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Dec 01 09:25:58 compute-0 systemd[1]: Started libpod-conmon-906fb157a2f7ac3e305ff8a6d410af6b8b3559ec87fad44f83bed6225dbd0165.scope.
Dec 01 09:25:58 compute-0 podman[168252]: 2025-12-01 09:25:58.334092015 +0000 UTC m=+0.025725956 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:25:58 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:25:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dc596e7b0cf8f8b470c9654b6e318c4e5ef2ea1d60dd0c4bd54ffaa77a715665/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:25:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dc596e7b0cf8f8b470c9654b6e318c4e5ef2ea1d60dd0c4bd54ffaa77a715665/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:25:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dc596e7b0cf8f8b470c9654b6e318c4e5ef2ea1d60dd0c4bd54ffaa77a715665/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:25:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dc596e7b0cf8f8b470c9654b6e318c4e5ef2ea1d60dd0c4bd54ffaa77a715665/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:25:58 compute-0 podman[168252]: 2025-12-01 09:25:58.447047452 +0000 UTC m=+0.138681383 container init 906fb157a2f7ac3e305ff8a6d410af6b8b3559ec87fad44f83bed6225dbd0165 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=blissful_sutherland, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec 01 09:25:58 compute-0 podman[168252]: 2025-12-01 09:25:58.456149115 +0000 UTC m=+0.147783016 container start 906fb157a2f7ac3e305ff8a6d410af6b8b3559ec87fad44f83bed6225dbd0165 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=blissful_sutherland, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef)
Dec 01 09:25:58 compute-0 podman[168252]: 2025-12-01 09:25:58.461586422 +0000 UTC m=+0.153220353 container attach 906fb157a2f7ac3e305ff8a6d410af6b8b3559ec87fad44f83bed6225dbd0165 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=blissful_sutherland, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0)
Dec 01 09:25:58 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:25:58 compute-0 ceph-mon[75031]: pgmap v420: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:59 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v421: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:25:59 compute-0 blissful_sutherland[168268]: {
Dec 01 09:25:59 compute-0 blissful_sutherland[168268]:     "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec 01 09:25:59 compute-0 blissful_sutherland[168268]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:25:59 compute-0 blissful_sutherland[168268]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 01 09:25:59 compute-0 blissful_sutherland[168268]:         "osd_id": 0,
Dec 01 09:25:59 compute-0 blissful_sutherland[168268]:         "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:25:59 compute-0 blissful_sutherland[168268]:         "type": "bluestore"
Dec 01 09:25:59 compute-0 blissful_sutherland[168268]:     },
Dec 01 09:25:59 compute-0 blissful_sutherland[168268]:     "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec 01 09:25:59 compute-0 blissful_sutherland[168268]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:25:59 compute-0 blissful_sutherland[168268]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 01 09:25:59 compute-0 blissful_sutherland[168268]:         "osd_id": 1,
Dec 01 09:25:59 compute-0 blissful_sutherland[168268]:         "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:25:59 compute-0 blissful_sutherland[168268]:         "type": "bluestore"
Dec 01 09:25:59 compute-0 blissful_sutherland[168268]:     },
Dec 01 09:25:59 compute-0 blissful_sutherland[168268]:     "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec 01 09:25:59 compute-0 blissful_sutherland[168268]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:25:59 compute-0 blissful_sutherland[168268]:         "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec 01 09:25:59 compute-0 blissful_sutherland[168268]:         "osd_id": 2,
Dec 01 09:25:59 compute-0 blissful_sutherland[168268]:         "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:25:59 compute-0 blissful_sutherland[168268]:         "type": "bluestore"
Dec 01 09:25:59 compute-0 blissful_sutherland[168268]:     }
Dec 01 09:25:59 compute-0 blissful_sutherland[168268]: }
Dec 01 09:25:59 compute-0 systemd[1]: libpod-906fb157a2f7ac3e305ff8a6d410af6b8b3559ec87fad44f83bed6225dbd0165.scope: Deactivated successfully.
Dec 01 09:25:59 compute-0 podman[168252]: 2025-12-01 09:25:59.50841763 +0000 UTC m=+1.200051541 container died 906fb157a2f7ac3e305ff8a6d410af6b8b3559ec87fad44f83bed6225dbd0165 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=blissful_sutherland, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:25:59 compute-0 systemd[1]: libpod-906fb157a2f7ac3e305ff8a6d410af6b8b3559ec87fad44f83bed6225dbd0165.scope: Consumed 1.047s CPU time.
Dec 01 09:25:59 compute-0 systemd[1]: var-lib-containers-storage-overlay-dc596e7b0cf8f8b470c9654b6e318c4e5ef2ea1d60dd0c4bd54ffaa77a715665-merged.mount: Deactivated successfully.
Dec 01 09:25:59 compute-0 podman[168252]: 2025-12-01 09:25:59.567776067 +0000 UTC m=+1.259409958 container remove 906fb157a2f7ac3e305ff8a6d410af6b8b3559ec87fad44f83bed6225dbd0165 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=blissful_sutherland, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:25:59 compute-0 systemd[1]: libpod-conmon-906fb157a2f7ac3e305ff8a6d410af6b8b3559ec87fad44f83bed6225dbd0165.scope: Deactivated successfully.
Dec 01 09:25:59 compute-0 sudo[168141]: pam_unix(sudo:session): session closed for user root
Dec 01 09:25:59 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:25:59 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:25:59 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:25:59 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:25:59 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 1f41a2df-dd36-42b5-8782-cff7b110fc0c does not exist
Dec 01 09:25:59 compute-0 sudo[168314]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:25:59 compute-0 sudo[168314]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:25:59 compute-0 sudo[168314]: pam_unix(sudo:session): session closed for user root
Dec 01 09:25:59 compute-0 sudo[168340]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:25:59 compute-0 sudo[168340]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:25:59 compute-0 sudo[168340]: pam_unix(sudo:session): session closed for user root
Dec 01 09:26:00 compute-0 ceph-mon[75031]: pgmap v421: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:00 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:26:00 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:26:01 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v422: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:02 compute-0 ceph-mon[75031]: pgmap v422: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:03 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v423: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:03 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:26:04 compute-0 ceph-mon[75031]: pgmap v423: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:05 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v424: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:06 compute-0 ceph-mon[75031]: pgmap v424: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:07 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v425: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:08 compute-0 kernel: SELinux:  Converting 2768 SID table entries...
Dec 01 09:26:08 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 01 09:26:08 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 01 09:26:08 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 01 09:26:08 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 01 09:26:08 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 01 09:26:08 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 01 09:26:08 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 01 09:26:08 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:26:08 compute-0 ceph-mon[75031]: pgmap v425: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:09 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v426: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:10 compute-0 ceph-mon[75031]: pgmap v426: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:11 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v427: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:12 compute-0 ceph-mon[75031]: pgmap v427: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:12 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:26:12
Dec 01 09:26:12 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:26:12 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:26:12 compute-0 ceph-mgr[75324]: [balancer INFO root] pools ['backups', 'vms', 'cephfs.cephfs.data', 'images', 'cephfs.cephfs.meta', '.mgr', 'volumes']
Dec 01 09:26:12 compute-0 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec 01 09:26:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:26:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:26:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:26:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:26:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:26:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:26:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:26:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:26:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:26:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:26:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:26:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:26:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:26:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:26:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:26:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:26:13 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v428: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:26:14 compute-0 ceph-mon[75031]: pgmap v428: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:15 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v429: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:16 compute-0 ceph-mon[75031]: pgmap v429: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:16 compute-0 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Dec 01 09:26:17 compute-0 podman[168373]: 2025-12-01 09:26:17.024107627 +0000 UTC m=+0.106891622 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 01 09:26:17 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v430: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:18 compute-0 kernel: SELinux:  Converting 2768 SID table entries...
Dec 01 09:26:18 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 01 09:26:18 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 01 09:26:18 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 01 09:26:18 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 01 09:26:18 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 01 09:26:18 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 01 09:26:18 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 01 09:26:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:26:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:26:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:26:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:26:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:26:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:26:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:26:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:26:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:26:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:26:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:26:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:26:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec 01 09:26:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:26:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:26:18 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:26:18 compute-0 ceph-mon[75031]: pgmap v430: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:19 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v431: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:26:20.458 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:26:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:26:20.460 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:26:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:26:20.461 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:26:21 compute-0 ceph-mon[75031]: pgmap v431: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:21 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v432: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:21 compute-0 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Dec 01 09:26:21 compute-0 podman[168406]: 2025-12-01 09:26:21.979442674 +0000 UTC m=+0.078445510 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:26:23 compute-0 ceph-mon[75031]: pgmap v432: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:23 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v433: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:23 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:26:25 compute-0 ceph-mon[75031]: pgmap v433: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:25 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v434: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:27 compute-0 ceph-mon[75031]: pgmap v434: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:27 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v435: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:29 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:26:29 compute-0 ceph-mon[75031]: pgmap v435: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:29 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v436: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:31 compute-0 ceph-mon[75031]: pgmap v436: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:31 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v437: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:33 compute-0 ceph-mon[75031]: pgmap v437: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:33 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v438: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:34 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:26:34 compute-0 ceph-mon[75031]: pgmap v438: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:35 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v439: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:36 compute-0 ceph-mon[75031]: pgmap v439: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:37 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v440: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:38 compute-0 ceph-mon[75031]: pgmap v440: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:39 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:26:39 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v441: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:40 compute-0 ceph-mon[75031]: pgmap v441: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:41 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v442: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:42 compute-0 ceph-mon[75031]: pgmap v442: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:26:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:26:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:26:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:26:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:26:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:26:43 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v443: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:26:44 compute-0 ceph-mon[75031]: pgmap v443: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:45 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v444: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:46 compute-0 ceph-mon[75031]: pgmap v444: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:47 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v445: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:48 compute-0 podman[179345]: 2025-12-01 09:26:48.005635473 +0000 UTC m=+0.103195745 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 01 09:26:48 compute-0 ceph-mon[75031]: pgmap v445: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:49 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:26:49 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v446: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:50 compute-0 ceph-mon[75031]: pgmap v446: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:51 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v447: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:52 compute-0 ceph-mon[75031]: pgmap v447: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:52 compute-0 podman[182586]: 2025-12-01 09:26:52.994099237 +0000 UTC m=+0.084049842 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 01 09:26:53 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v448: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:54 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:26:55 compute-0 ceph-mon[75031]: pgmap v448: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:55 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v449: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:57 compute-0 ceph-mon[75031]: pgmap v449: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:57 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v450: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:59 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:26:59 compute-0 ceph-mon[75031]: pgmap v450: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:59 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v451: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:26:59 compute-0 sudo[185265]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:26:59 compute-0 sudo[185265]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:26:59 compute-0 sudo[185265]: pam_unix(sudo:session): session closed for user root
Dec 01 09:26:59 compute-0 sudo[185290]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:26:59 compute-0 sudo[185290]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:26:59 compute-0 sudo[185290]: pam_unix(sudo:session): session closed for user root
Dec 01 09:26:59 compute-0 sudo[185315]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:26:59 compute-0 sudo[185315]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:26:59 compute-0 sudo[185315]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:00 compute-0 sudo[185340]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ls
Dec 01 09:27:00 compute-0 sudo[185340]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:27:00 compute-0 ceph-mon[75031]: pgmap v451: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:00 compute-0 podman[185436]: 2025-12-01 09:27:00.456479476 +0000 UTC m=+0.066412082 container exec a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec 01 09:27:00 compute-0 podman[185436]: 2025-12-01 09:27:00.565620743 +0000 UTC m=+0.175553359 container exec_died a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:27:01 compute-0 sudo[185340]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:01 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:27:01 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:27:01 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:27:01 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:27:01 compute-0 sudo[185583]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:27:01 compute-0 sudo[185583]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:27:01 compute-0 sudo[185583]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:01 compute-0 sudo[185608]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:27:01 compute-0 sudo[185608]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:27:01 compute-0 sudo[185608]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:01 compute-0 sudo[185633]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:27:01 compute-0 sudo[185633]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:27:01 compute-0 sudo[185633]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:01 compute-0 sudo[185658]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Dec 01 09:27:01 compute-0 sudo[185658]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:27:01 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v452: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:01 compute-0 sudo[185658]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:01 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:27:01 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:27:01 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec 01 09:27:01 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:27:01 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec 01 09:27:01 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:27:01 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 29faa56f-d14b-410f-9fb2-68c3bc466846 does not exist
Dec 01 09:27:01 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 7a9dc25b-66b8-4eec-840c-b4ca185931e2 does not exist
Dec 01 09:27:01 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 2d3bfa0d-99f2-46eb-9626-708bd8cde677 does not exist
Dec 01 09:27:01 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec 01 09:27:01 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:27:01 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec 01 09:27:01 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:27:01 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:27:01 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:27:01 compute-0 sudo[185716]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:27:01 compute-0 sudo[185716]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:27:01 compute-0 sudo[185716]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:01 compute-0 sudo[185741]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:27:01 compute-0 sudo[185741]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:27:01 compute-0 sudo[185741]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:01 compute-0 sudo[185766]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:27:01 compute-0 sudo[185766]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:27:01 compute-0 sudo[185766]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:02 compute-0 sudo[185791]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --yes --no-systemd
Dec 01 09:27:02 compute-0 sudo[185791]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:27:02 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:27:02 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:27:02 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:27:02 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:27:02 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:27:02 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:27:02 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:27:02 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:27:02 compute-0 podman[185854]: 2025-12-01 09:27:02.351021314 +0000 UTC m=+0.045310402 container create 1854b091dcf2a491688262d08516cb1f544ca23f306e9e819d46f37e35d48ccc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_engelbart, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:27:02 compute-0 systemd[1]: Started libpod-conmon-1854b091dcf2a491688262d08516cb1f544ca23f306e9e819d46f37e35d48ccc.scope.
Dec 01 09:27:02 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:27:02 compute-0 podman[185854]: 2025-12-01 09:27:02.330334995 +0000 UTC m=+0.024624173 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:27:02 compute-0 podman[185854]: 2025-12-01 09:27:02.437789973 +0000 UTC m=+0.132079091 container init 1854b091dcf2a491688262d08516cb1f544ca23f306e9e819d46f37e35d48ccc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_engelbart, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:27:02 compute-0 podman[185854]: 2025-12-01 09:27:02.446729762 +0000 UTC m=+0.141018860 container start 1854b091dcf2a491688262d08516cb1f544ca23f306e9e819d46f37e35d48ccc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_engelbart, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Dec 01 09:27:02 compute-0 podman[185854]: 2025-12-01 09:27:02.450636205 +0000 UTC m=+0.144925303 container attach 1854b091dcf2a491688262d08516cb1f544ca23f306e9e819d46f37e35d48ccc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_engelbart, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Dec 01 09:27:02 compute-0 priceless_engelbart[185871]: 167 167
Dec 01 09:27:02 compute-0 systemd[1]: libpod-1854b091dcf2a491688262d08516cb1f544ca23f306e9e819d46f37e35d48ccc.scope: Deactivated successfully.
Dec 01 09:27:02 compute-0 podman[185854]: 2025-12-01 09:27:02.463153887 +0000 UTC m=+0.157442985 container died 1854b091dcf2a491688262d08516cb1f544ca23f306e9e819d46f37e35d48ccc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_engelbart, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:27:02 compute-0 systemd[1]: var-lib-containers-storage-overlay-0f6a05b6348241c82bec9aa6285ca3f4791a4195c81e0ade4b11b037687d1b96-merged.mount: Deactivated successfully.
Dec 01 09:27:02 compute-0 podman[185854]: 2025-12-01 09:27:02.502744502 +0000 UTC m=+0.197033600 container remove 1854b091dcf2a491688262d08516cb1f544ca23f306e9e819d46f37e35d48ccc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_engelbart, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_REF=reef, io.buildah.version=1.39.3)
Dec 01 09:27:02 compute-0 systemd[1]: libpod-conmon-1854b091dcf2a491688262d08516cb1f544ca23f306e9e819d46f37e35d48ccc.scope: Deactivated successfully.
Dec 01 09:27:02 compute-0 podman[185895]: 2025-12-01 09:27:02.673206532 +0000 UTC m=+0.041488251 container create 6491c76307312d2a3e57be44546880e51358186c43c3aa0ef8719d11d4d4494b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_lalande, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 01 09:27:02 compute-0 systemd[1]: Started libpod-conmon-6491c76307312d2a3e57be44546880e51358186c43c3aa0ef8719d11d4d4494b.scope.
Dec 01 09:27:02 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:27:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/917b0b7a1b44b3901c5258f8f72eaabbc0a11c2760c493258442fa9ec47b1ad3/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:27:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/917b0b7a1b44b3901c5258f8f72eaabbc0a11c2760c493258442fa9ec47b1ad3/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:27:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/917b0b7a1b44b3901c5258f8f72eaabbc0a11c2760c493258442fa9ec47b1ad3/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:27:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/917b0b7a1b44b3901c5258f8f72eaabbc0a11c2760c493258442fa9ec47b1ad3/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:27:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/917b0b7a1b44b3901c5258f8f72eaabbc0a11c2760c493258442fa9ec47b1ad3/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:27:02 compute-0 podman[185895]: 2025-12-01 09:27:02.657180859 +0000 UTC m=+0.025462598 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:27:02 compute-0 podman[185895]: 2025-12-01 09:27:02.7678537 +0000 UTC m=+0.136135449 container init 6491c76307312d2a3e57be44546880e51358186c43c3aa0ef8719d11d4d4494b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_lalande, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Dec 01 09:27:02 compute-0 podman[185895]: 2025-12-01 09:27:02.776876071 +0000 UTC m=+0.145157790 container start 6491c76307312d2a3e57be44546880e51358186c43c3aa0ef8719d11d4d4494b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_lalande, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec 01 09:27:02 compute-0 podman[185895]: 2025-12-01 09:27:02.780444594 +0000 UTC m=+0.148726323 container attach 6491c76307312d2a3e57be44546880e51358186c43c3aa0ef8719d11d4d4494b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_lalande, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.39.3)
Dec 01 09:27:03 compute-0 ceph-mon[75031]: pgmap v452: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:03 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v453: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:03 compute-0 strange_lalande[185913]: --> passed data devices: 0 physical, 3 LVM
Dec 01 09:27:03 compute-0 strange_lalande[185913]: --> relative data size: 1.0
Dec 01 09:27:03 compute-0 strange_lalande[185913]: --> All data devices are unavailable
Dec 01 09:27:03 compute-0 systemd[1]: libpod-6491c76307312d2a3e57be44546880e51358186c43c3aa0ef8719d11d4d4494b.scope: Deactivated successfully.
Dec 01 09:27:03 compute-0 podman[185895]: 2025-12-01 09:27:03.868908676 +0000 UTC m=+1.237190395 container died 6491c76307312d2a3e57be44546880e51358186c43c3aa0ef8719d11d4d4494b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_lalande, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3)
Dec 01 09:27:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-917b0b7a1b44b3901c5258f8f72eaabbc0a11c2760c493258442fa9ec47b1ad3-merged.mount: Deactivated successfully.
Dec 01 09:27:03 compute-0 podman[185895]: 2025-12-01 09:27:03.916794861 +0000 UTC m=+1.285076580 container remove 6491c76307312d2a3e57be44546880e51358186c43c3aa0ef8719d11d4d4494b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_lalande, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2)
Dec 01 09:27:03 compute-0 systemd[1]: libpod-conmon-6491c76307312d2a3e57be44546880e51358186c43c3aa0ef8719d11d4d4494b.scope: Deactivated successfully.
Dec 01 09:27:03 compute-0 sudo[185791]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:04 compute-0 sudo[185956]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:27:04 compute-0 sudo[185956]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:27:04 compute-0 sudo[185956]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:04 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:27:04 compute-0 sudo[185981]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:27:04 compute-0 sudo[185981]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:27:04 compute-0 sudo[185981]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:04 compute-0 sudo[186006]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:27:04 compute-0 sudo[186006]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:27:04 compute-0 sudo[186006]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:04 compute-0 sudo[186031]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- lvm list --format json
Dec 01 09:27:04 compute-0 sudo[186031]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:27:04 compute-0 ceph-mon[75031]: pgmap v453: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:04 compute-0 podman[186096]: 2025-12-01 09:27:04.612525024 +0000 UTC m=+0.044265571 container create 5bb3638ac6108637b25eba6dd046b4e0bca9fac43202c88fa561bfb05dba5c9a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_mirzakhani, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:27:04 compute-0 systemd[1]: Started libpod-conmon-5bb3638ac6108637b25eba6dd046b4e0bca9fac43202c88fa561bfb05dba5c9a.scope.
Dec 01 09:27:04 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:27:04 compute-0 podman[186096]: 2025-12-01 09:27:04.675367062 +0000 UTC m=+0.107107619 container init 5bb3638ac6108637b25eba6dd046b4e0bca9fac43202c88fa561bfb05dba5c9a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_mirzakhani, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:27:04 compute-0 podman[186096]: 2025-12-01 09:27:04.68118433 +0000 UTC m=+0.112924877 container start 5bb3638ac6108637b25eba6dd046b4e0bca9fac43202c88fa561bfb05dba5c9a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_mirzakhani, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:27:04 compute-0 podman[186096]: 2025-12-01 09:27:04.684438564 +0000 UTC m=+0.116179111 container attach 5bb3638ac6108637b25eba6dd046b4e0bca9fac43202c88fa561bfb05dba5c9a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_mirzakhani, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Dec 01 09:27:04 compute-0 practical_mirzakhani[186112]: 167 167
Dec 01 09:27:04 compute-0 podman[186096]: 2025-12-01 09:27:04.58957807 +0000 UTC m=+0.021318637 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:27:04 compute-0 systemd[1]: libpod-5bb3638ac6108637b25eba6dd046b4e0bca9fac43202c88fa561bfb05dba5c9a.scope: Deactivated successfully.
Dec 01 09:27:04 compute-0 podman[186096]: 2025-12-01 09:27:04.68773948 +0000 UTC m=+0.119480037 container died 5bb3638ac6108637b25eba6dd046b4e0bca9fac43202c88fa561bfb05dba5c9a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_mirzakhani, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Dec 01 09:27:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-7e265f55a7e5e888b92e8055db8127a08e2cdcef43cce6fbb9c404baa884e63b-merged.mount: Deactivated successfully.
Dec 01 09:27:04 compute-0 podman[186096]: 2025-12-01 09:27:04.724551834 +0000 UTC m=+0.156292381 container remove 5bb3638ac6108637b25eba6dd046b4e0bca9fac43202c88fa561bfb05dba5c9a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_mirzakhani, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:27:04 compute-0 systemd[1]: libpod-conmon-5bb3638ac6108637b25eba6dd046b4e0bca9fac43202c88fa561bfb05dba5c9a.scope: Deactivated successfully.
Dec 01 09:27:04 compute-0 podman[186135]: 2025-12-01 09:27:04.885199861 +0000 UTC m=+0.043363535 container create 9c28e69c2532e9af2e20b3d0e15ecc3867a70ecf4a80b3f5ecc208361032f764 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_hodgkin, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:27:04 compute-0 systemd[1]: Started libpod-conmon-9c28e69c2532e9af2e20b3d0e15ecc3867a70ecf4a80b3f5ecc208361032f764.scope.
Dec 01 09:27:04 compute-0 podman[186135]: 2025-12-01 09:27:04.86579338 +0000 UTC m=+0.023957074 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:27:04 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:27:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f415c2420a5d0d5217afe86c70c2d2f7d50061b932649db1b4578bc1a581f03b/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:27:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f415c2420a5d0d5217afe86c70c2d2f7d50061b932649db1b4578bc1a581f03b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:27:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f415c2420a5d0d5217afe86c70c2d2f7d50061b932649db1b4578bc1a581f03b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:27:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f415c2420a5d0d5217afe86c70c2d2f7d50061b932649db1b4578bc1a581f03b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:27:05 compute-0 podman[186135]: 2025-12-01 09:27:05.074881367 +0000 UTC m=+0.233045091 container init 9c28e69c2532e9af2e20b3d0e15ecc3867a70ecf4a80b3f5ecc208361032f764 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_hodgkin, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef)
Dec 01 09:27:05 compute-0 podman[186135]: 2025-12-01 09:27:05.082680693 +0000 UTC m=+0.240844377 container start 9c28e69c2532e9af2e20b3d0e15ecc3867a70ecf4a80b3f5ecc208361032f764 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_hodgkin, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Dec 01 09:27:05 compute-0 podman[186135]: 2025-12-01 09:27:05.132824813 +0000 UTC m=+0.290988487 container attach 9c28e69c2532e9af2e20b3d0e15ecc3867a70ecf4a80b3f5ecc208361032f764 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_hodgkin, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:27:05 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v454: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]: {
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:     "0": [
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:         {
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             "devices": [
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "/dev/loop3"
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             ],
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             "lv_name": "ceph_lv0",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             "lv_size": "21470642176",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             "name": "ceph_lv0",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             "tags": {
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.cluster_name": "ceph",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.crush_device_class": "",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.encrypted": "0",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.osd_id": "0",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.type": "block",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.vdo": "0"
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             },
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             "type": "block",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             "vg_name": "ceph_vg0"
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:         }
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:     ],
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:     "1": [
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:         {
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             "devices": [
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "/dev/loop4"
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             ],
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             "lv_name": "ceph_lv1",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             "lv_size": "21470642176",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             "name": "ceph_lv1",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             "tags": {
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.cluster_name": "ceph",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.crush_device_class": "",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.encrypted": "0",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.osd_id": "1",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.type": "block",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.vdo": "0"
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             },
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             "type": "block",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             "vg_name": "ceph_vg1"
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:         }
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:     ],
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:     "2": [
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:         {
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             "devices": [
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "/dev/loop5"
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             ],
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             "lv_name": "ceph_lv2",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             "lv_size": "21470642176",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             "name": "ceph_lv2",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             "tags": {
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.cluster_name": "ceph",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.crush_device_class": "",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.encrypted": "0",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.osd_id": "2",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.type": "block",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:                 "ceph.vdo": "0"
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             },
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             "type": "block",
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:             "vg_name": "ceph_vg2"
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:         }
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]:     ]
Dec 01 09:27:05 compute-0 suspicious_hodgkin[186151]: }
Dec 01 09:27:05 compute-0 systemd[1]: libpod-9c28e69c2532e9af2e20b3d0e15ecc3867a70ecf4a80b3f5ecc208361032f764.scope: Deactivated successfully.
Dec 01 09:27:05 compute-0 podman[186160]: 2025-12-01 09:27:05.898896231 +0000 UTC m=+0.022622495 container died 9c28e69c2532e9af2e20b3d0e15ecc3867a70ecf4a80b3f5ecc208361032f764 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_hodgkin, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:27:05 compute-0 systemd[1]: var-lib-containers-storage-overlay-f415c2420a5d0d5217afe86c70c2d2f7d50061b932649db1b4578bc1a581f03b-merged.mount: Deactivated successfully.
Dec 01 09:27:05 compute-0 podman[186160]: 2025-12-01 09:27:05.95139792 +0000 UTC m=+0.075124154 container remove 9c28e69c2532e9af2e20b3d0e15ecc3867a70ecf4a80b3f5ecc208361032f764 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_hodgkin, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:27:05 compute-0 systemd[1]: libpod-conmon-9c28e69c2532e9af2e20b3d0e15ecc3867a70ecf4a80b3f5ecc208361032f764.scope: Deactivated successfully.
Dec 01 09:27:06 compute-0 sudo[186031]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:06 compute-0 sudo[186173]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:27:06 compute-0 sudo[186173]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:27:06 compute-0 sudo[186173]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:06 compute-0 sudo[186198]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:27:06 compute-0 sudo[186198]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:27:06 compute-0 sudo[186198]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:06 compute-0 sudo[186223]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:27:06 compute-0 sudo[186223]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:27:06 compute-0 sudo[186223]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:06 compute-0 sudo[186248]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- raw list --format json
Dec 01 09:27:06 compute-0 sudo[186248]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:27:06 compute-0 ceph-mon[75031]: pgmap v454: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:06 compute-0 podman[186312]: 2025-12-01 09:27:06.615412815 +0000 UTC m=+0.042661475 container create 45086fd5b3c2315d688602e57e127f2e7d3722bdd8420b107f8bcb6e730ceaf7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_allen, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 01 09:27:06 compute-0 systemd[1]: Started libpod-conmon-45086fd5b3c2315d688602e57e127f2e7d3722bdd8420b107f8bcb6e730ceaf7.scope.
Dec 01 09:27:06 compute-0 podman[186312]: 2025-12-01 09:27:06.596036425 +0000 UTC m=+0.023285085 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:27:06 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:27:06 compute-0 podman[186312]: 2025-12-01 09:27:06.751583664 +0000 UTC m=+0.178832394 container init 45086fd5b3c2315d688602e57e127f2e7d3722bdd8420b107f8bcb6e730ceaf7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_allen, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 01 09:27:06 compute-0 podman[186312]: 2025-12-01 09:27:06.758854644 +0000 UTC m=+0.186103284 container start 45086fd5b3c2315d688602e57e127f2e7d3722bdd8420b107f8bcb6e730ceaf7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_allen, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:27:06 compute-0 bold_allen[186330]: 167 167
Dec 01 09:27:06 compute-0 systemd[1]: libpod-45086fd5b3c2315d688602e57e127f2e7d3722bdd8420b107f8bcb6e730ceaf7.scope: Deactivated successfully.
Dec 01 09:27:06 compute-0 podman[186312]: 2025-12-01 09:27:06.774126986 +0000 UTC m=+0.201375676 container attach 45086fd5b3c2315d688602e57e127f2e7d3722bdd8420b107f8bcb6e730ceaf7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_allen, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef)
Dec 01 09:27:06 compute-0 podman[186312]: 2025-12-01 09:27:06.774564969 +0000 UTC m=+0.201813619 container died 45086fd5b3c2315d688602e57e127f2e7d3722bdd8420b107f8bcb6e730ceaf7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_allen, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Dec 01 09:27:06 compute-0 systemd[1]: var-lib-containers-storage-overlay-2a1baba79d5f0ff6b8549bcb836717ea9ff79ac964811f1e515784246bae9f70-merged.mount: Deactivated successfully.
Dec 01 09:27:06 compute-0 podman[186312]: 2025-12-01 09:27:06.828712234 +0000 UTC m=+0.255960904 container remove 45086fd5b3c2315d688602e57e127f2e7d3722bdd8420b107f8bcb6e730ceaf7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_allen, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef)
Dec 01 09:27:06 compute-0 systemd[1]: libpod-conmon-45086fd5b3c2315d688602e57e127f2e7d3722bdd8420b107f8bcb6e730ceaf7.scope: Deactivated successfully.
Dec 01 09:27:07 compute-0 podman[186354]: 2025-12-01 09:27:07.033168777 +0000 UTC m=+0.031026328 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:27:07 compute-0 podman[186354]: 2025-12-01 09:27:07.170427778 +0000 UTC m=+0.168285259 container create c951206b456534157b0f69837a5d60d6d30c94e2b096902e796cdaac9d5ae14a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_gould, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Dec 01 09:27:07 compute-0 systemd[1]: Started libpod-conmon-c951206b456534157b0f69837a5d60d6d30c94e2b096902e796cdaac9d5ae14a.scope.
Dec 01 09:27:07 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:27:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4ecbed6342a885750bd179fd6e98adf5bc756f01e689bc6b91df3dcc39b24de/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:27:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4ecbed6342a885750bd179fd6e98adf5bc756f01e689bc6b91df3dcc39b24de/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:27:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4ecbed6342a885750bd179fd6e98adf5bc756f01e689bc6b91df3dcc39b24de/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:27:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4ecbed6342a885750bd179fd6e98adf5bc756f01e689bc6b91df3dcc39b24de/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:27:07 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v455: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:07 compute-0 podman[186354]: 2025-12-01 09:27:07.402081618 +0000 UTC m=+0.399939139 container init c951206b456534157b0f69837a5d60d6d30c94e2b096902e796cdaac9d5ae14a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_gould, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3)
Dec 01 09:27:07 compute-0 podman[186354]: 2025-12-01 09:27:07.409343638 +0000 UTC m=+0.407201159 container start c951206b456534157b0f69837a5d60d6d30c94e2b096902e796cdaac9d5ae14a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_gould, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Dec 01 09:27:07 compute-0 podman[186354]: 2025-12-01 09:27:07.41427363 +0000 UTC m=+0.412131161 container attach c951206b456534157b0f69837a5d60d6d30c94e2b096902e796cdaac9d5ae14a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_gould, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Dec 01 09:27:08 compute-0 frosty_gould[186370]: {
Dec 01 09:27:08 compute-0 frosty_gould[186370]:     "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec 01 09:27:08 compute-0 frosty_gould[186370]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:27:08 compute-0 frosty_gould[186370]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 01 09:27:08 compute-0 frosty_gould[186370]:         "osd_id": 0,
Dec 01 09:27:08 compute-0 frosty_gould[186370]:         "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:27:08 compute-0 frosty_gould[186370]:         "type": "bluestore"
Dec 01 09:27:08 compute-0 frosty_gould[186370]:     },
Dec 01 09:27:08 compute-0 frosty_gould[186370]:     "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec 01 09:27:08 compute-0 frosty_gould[186370]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:27:08 compute-0 frosty_gould[186370]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 01 09:27:08 compute-0 frosty_gould[186370]:         "osd_id": 1,
Dec 01 09:27:08 compute-0 frosty_gould[186370]:         "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:27:08 compute-0 frosty_gould[186370]:         "type": "bluestore"
Dec 01 09:27:08 compute-0 frosty_gould[186370]:     },
Dec 01 09:27:08 compute-0 frosty_gould[186370]:     "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec 01 09:27:08 compute-0 frosty_gould[186370]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:27:08 compute-0 frosty_gould[186370]:         "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec 01 09:27:08 compute-0 frosty_gould[186370]:         "osd_id": 2,
Dec 01 09:27:08 compute-0 frosty_gould[186370]:         "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:27:08 compute-0 frosty_gould[186370]:         "type": "bluestore"
Dec 01 09:27:08 compute-0 frosty_gould[186370]:     }
Dec 01 09:27:08 compute-0 frosty_gould[186370]: }
Dec 01 09:27:08 compute-0 ceph-mon[75031]: pgmap v455: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:08 compute-0 systemd[1]: libpod-c951206b456534157b0f69837a5d60d6d30c94e2b096902e796cdaac9d5ae14a.scope: Deactivated successfully.
Dec 01 09:27:08 compute-0 podman[186354]: 2025-12-01 09:27:08.575719484 +0000 UTC m=+1.573576965 container died c951206b456534157b0f69837a5d60d6d30c94e2b096902e796cdaac9d5ae14a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_gould, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 01 09:27:08 compute-0 systemd[1]: libpod-c951206b456534157b0f69837a5d60d6d30c94e2b096902e796cdaac9d5ae14a.scope: Consumed 1.168s CPU time.
Dec 01 09:27:08 compute-0 systemd[1]: var-lib-containers-storage-overlay-b4ecbed6342a885750bd179fd6e98adf5bc756f01e689bc6b91df3dcc39b24de-merged.mount: Deactivated successfully.
Dec 01 09:27:08 compute-0 podman[186354]: 2025-12-01 09:27:08.735509976 +0000 UTC m=+1.733367477 container remove c951206b456534157b0f69837a5d60d6d30c94e2b096902e796cdaac9d5ae14a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_gould, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:27:08 compute-0 systemd[1]: libpod-conmon-c951206b456534157b0f69837a5d60d6d30c94e2b096902e796cdaac9d5ae14a.scope: Deactivated successfully.
Dec 01 09:27:08 compute-0 sudo[186248]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:08 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:27:08 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:27:08 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:27:08 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:27:08 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 84ff2688-aca2-48d2-a896-5341958d83c0 does not exist
Dec 01 09:27:08 compute-0 sudo[186415]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:27:08 compute-0 sudo[186415]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:27:08 compute-0 sudo[186415]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:09 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:27:09 compute-0 sudo[186440]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:27:09 compute-0 sudo[186440]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:27:09 compute-0 sudo[186440]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:09 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v456: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:09 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:27:09 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:27:11 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v457: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:11 compute-0 ceph-mon[75031]: pgmap v456: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:11 compute-0 kernel: SELinux:  Converting 2769 SID table entries...
Dec 01 09:27:11 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 01 09:27:11 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 01 09:27:11 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 01 09:27:11 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 01 09:27:11 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 01 09:27:11 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 01 09:27:11 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 01 09:27:12 compute-0 ceph-mon[75031]: pgmap v457: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:27:12
Dec 01 09:27:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:27:13 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:27:13 compute-0 ceph-mgr[75324]: [balancer INFO root] pools ['vms', '.mgr', 'volumes', 'backups', 'images', 'cephfs.cephfs.data', 'cephfs.cephfs.meta']
Dec 01 09:27:13 compute-0 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec 01 09:27:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:27:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:27:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:27:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:27:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:27:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:27:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:27:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:27:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:27:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:27:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:27:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:27:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:27:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:27:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:27:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:27:13 compute-0 groupadd[186477]: group added to /etc/group: name=dnsmasq, GID=991
Dec 01 09:27:13 compute-0 groupadd[186477]: group added to /etc/gshadow: name=dnsmasq
Dec 01 09:27:13 compute-0 groupadd[186477]: new group: name=dnsmasq, GID=991
Dec 01 09:27:13 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v458: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:13 compute-0 useradd[186484]: new user: name=dnsmasq, UID=991, GID=991, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Dec 01 09:27:13 compute-0 dbus-broker-launch[757]: Noticed file-system modification, trigger reload.
Dec 01 09:27:13 compute-0 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Dec 01 09:27:13 compute-0 dbus-broker-launch[757]: Noticed file-system modification, trigger reload.
Dec 01 09:27:14 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:27:14 compute-0 groupadd[186497]: group added to /etc/group: name=clevis, GID=990
Dec 01 09:27:14 compute-0 groupadd[186497]: group added to /etc/gshadow: name=clevis
Dec 01 09:27:14 compute-0 groupadd[186497]: new group: name=clevis, GID=990
Dec 01 09:27:14 compute-0 useradd[186504]: new user: name=clevis, UID=990, GID=990, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Dec 01 09:27:14 compute-0 usermod[186514]: add 'clevis' to group 'tss'
Dec 01 09:27:14 compute-0 usermod[186514]: add 'clevis' to shadow group 'tss'
Dec 01 09:27:14 compute-0 ceph-mon[75031]: pgmap v458: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:15 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v459: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:16 compute-0 ceph-mon[75031]: pgmap v459: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:16 compute-0 polkitd[43441]: Reloading rules
Dec 01 09:27:16 compute-0 polkitd[43441]: Collecting garbage unconditionally...
Dec 01 09:27:16 compute-0 polkitd[43441]: Loading rules from directory /etc/polkit-1/rules.d
Dec 01 09:27:16 compute-0 polkitd[43441]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 01 09:27:16 compute-0 polkitd[43441]: Finished loading, compiling and executing 3 rules
Dec 01 09:27:16 compute-0 polkitd[43441]: Reloading rules
Dec 01 09:27:16 compute-0 polkitd[43441]: Collecting garbage unconditionally...
Dec 01 09:27:16 compute-0 polkitd[43441]: Loading rules from directory /etc/polkit-1/rules.d
Dec 01 09:27:16 compute-0 polkitd[43441]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 01 09:27:16 compute-0 polkitd[43441]: Finished loading, compiling and executing 3 rules
Dec 01 09:27:17 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v460: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:17 compute-0 groupadd[186701]: group added to /etc/group: name=ceph, GID=167
Dec 01 09:27:17 compute-0 groupadd[186701]: group added to /etc/gshadow: name=ceph
Dec 01 09:27:17 compute-0 groupadd[186701]: new group: name=ceph, GID=167
Dec 01 09:27:17 compute-0 useradd[186707]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Dec 01 09:27:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:27:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:27:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:27:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:27:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:27:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:27:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:27:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:27:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:27:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:27:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:27:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:27:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec 01 09:27:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:27:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:27:18 compute-0 ceph-mon[75031]: pgmap v460: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:19 compute-0 podman[186714]: 2025-12-01 09:27:19.008758534 +0000 UTC m=+0.101705632 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 01 09:27:19 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:27:19 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v461: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:27:20.460 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:27:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:27:20.461 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:27:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:27:20.461 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:27:20 compute-0 ceph-mon[75031]: pgmap v461: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:20 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Dec 01 09:27:20 compute-0 sshd[1008]: Received signal 15; terminating.
Dec 01 09:27:20 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Dec 01 09:27:20 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Dec 01 09:27:20 compute-0 systemd[1]: sshd.service: Consumed 2.319s CPU time, read 32.0K from disk, written 0B to disk.
Dec 01 09:27:20 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Dec 01 09:27:20 compute-0 systemd[1]: Stopping sshd-keygen.target...
Dec 01 09:27:20 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 01 09:27:20 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 01 09:27:20 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 01 09:27:20 compute-0 systemd[1]: Reached target sshd-keygen.target.
Dec 01 09:27:20 compute-0 systemd[1]: Starting OpenSSH server daemon...
Dec 01 09:27:20 compute-0 sshd[187348]: Server listening on 0.0.0.0 port 22.
Dec 01 09:27:20 compute-0 systemd[1]: Started OpenSSH server daemon.
Dec 01 09:27:20 compute-0 sshd[187348]: Server listening on :: port 22.
Dec 01 09:27:21 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v462: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:22 compute-0 ceph-mon[75031]: pgmap v462: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:22 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 01 09:27:22 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 01 09:27:23 compute-0 systemd[1]: Reloading.
Dec 01 09:27:23 compute-0 systemd-rc-local-generator[187622]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:27:23 compute-0 systemd-sysv-generator[187628]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:27:23 compute-0 podman[187581]: 2025-12-01 09:27:23.200093773 +0000 UTC m=+0.105952186 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 01 09:27:23 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 01 09:27:23 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v463: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:24 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:27:24 compute-0 ceph-mon[75031]: pgmap v463: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:25 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v464: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:25 compute-0 sudo[167203]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:26 compute-0 sudo[191827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zimgymjvqwqbunpsovwkndfjujnormqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581245.720275-336-200354897547224/AnsiballZ_systemd.py'
Dec 01 09:27:26 compute-0 sudo[191827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:27:26 compute-0 ceph-mon[75031]: pgmap v464: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:26 compute-0 python3.9[191850]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 01 09:27:26 compute-0 systemd[1]: Reloading.
Dec 01 09:27:27 compute-0 systemd-sysv-generator[192293]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:27:27 compute-0 systemd-rc-local-generator[192288]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:27:27 compute-0 sudo[191827]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:27 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v465: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:27 compute-0 sudo[193213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehkorwdliovjhxsnwdxfbafiwdxoatsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581247.533184-336-29376226911161/AnsiballZ_systemd.py'
Dec 01 09:27:27 compute-0 sudo[193213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:27:28 compute-0 python3.9[193233]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 01 09:27:28 compute-0 systemd[1]: Reloading.
Dec 01 09:27:28 compute-0 systemd-sysv-generator[193706]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:27:28 compute-0 systemd-rc-local-generator[193699]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:27:28 compute-0 sudo[193213]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:28 compute-0 ceph-mon[75031]: pgmap v465: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:29 compute-0 sudo[194616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnllntnjqfiisvfeudtgpegjdlbxfnbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581248.590706-336-78412119249340/AnsiballZ_systemd.py'
Dec 01 09:27:29 compute-0 sudo[194616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:27:29 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:27:29 compute-0 python3.9[194639]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 01 09:27:29 compute-0 systemd[1]: Reloading.
Dec 01 09:27:29 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v466: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:29 compute-0 systemd-rc-local-generator[195135]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:27:29 compute-0 systemd-sysv-generator[195141]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:27:29 compute-0 sudo[194616]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:30 compute-0 sudo[195972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jknroohqwnlyydnjjojziqnagwykbfll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581249.8569984-336-153155092980243/AnsiballZ_systemd.py'
Dec 01 09:27:30 compute-0 sudo[195972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:27:30 compute-0 python3.9[195998]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 01 09:27:30 compute-0 systemd[1]: Reloading.
Dec 01 09:27:30 compute-0 systemd-rc-local-generator[196461]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:27:30 compute-0 systemd-sysv-generator[196467]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:27:30 compute-0 sudo[195972]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:30 compute-0 ceph-mon[75031]: pgmap v466: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:31 compute-0 sudo[196788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrbuoxdgqbmlhufdvuffquhpokwqfxik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581250.998069-365-90179741510307/AnsiballZ_systemd.py'
Dec 01 09:27:31 compute-0 sudo[196788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:27:31 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v467: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:31 compute-0 python3.9[196790]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 09:27:31 compute-0 systemd[1]: Reloading.
Dec 01 09:27:31 compute-0 systemd-rc-local-generator[196862]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:27:31 compute-0 systemd-sysv-generator[196867]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:27:31 compute-0 sudo[196788]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:32 compute-0 sudo[197095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxmacjagvjiawbclxrllpxoksblbaswg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581252.118954-365-260538018766848/AnsiballZ_systemd.py'
Dec 01 09:27:32 compute-0 sudo[197095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:27:32 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 01 09:27:32 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 01 09:27:32 compute-0 systemd[1]: man-db-cache-update.service: Consumed 9.972s CPU time.
Dec 01 09:27:32 compute-0 systemd[1]: run-r9e0e46fc0b554e62bf094c5b2e064c83.service: Deactivated successfully.
Dec 01 09:27:32 compute-0 python3.9[197097]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 09:27:32 compute-0 systemd[1]: Reloading.
Dec 01 09:27:32 compute-0 systemd-rc-local-generator[197131]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:27:32 compute-0 systemd-sysv-generator[197135]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:27:33 compute-0 sudo[197095]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:33 compute-0 ceph-mon[75031]: pgmap v467: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:33 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v468: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:33 compute-0 sudo[197286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hngqzvbbliwgrcpgrzvhlhhynozdmbcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581253.355168-365-131303303233726/AnsiballZ_systemd.py'
Dec 01 09:27:33 compute-0 sudo[197286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:27:33 compute-0 python3.9[197288]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 09:27:34 compute-0 systemd[1]: Reloading.
Dec 01 09:27:34 compute-0 systemd-rc-local-generator[197317]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:27:34 compute-0 systemd-sysv-generator[197322]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:27:34 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:27:34 compute-0 sudo[197286]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:34 compute-0 sudo[197476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ouedhlaxbnatvmcdqefdpphndjevxixb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581254.4559164-365-207819645276229/AnsiballZ_systemd.py'
Dec 01 09:27:34 compute-0 sudo[197476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:27:35 compute-0 python3.9[197478]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 09:27:35 compute-0 sudo[197476]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:35 compute-0 ceph-mon[75031]: pgmap v468: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:35 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v469: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:35 compute-0 sudo[197631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apvxnyjloimvgysdumbgoxhyhjtqxixs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581255.2849514-365-54406428935277/AnsiballZ_systemd.py'
Dec 01 09:27:35 compute-0 sudo[197631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:27:35 compute-0 python3.9[197633]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 09:27:35 compute-0 systemd[1]: Reloading.
Dec 01 09:27:36 compute-0 systemd-rc-local-generator[197660]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:27:36 compute-0 systemd-sysv-generator[197665]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:27:36 compute-0 ceph-mon[75031]: pgmap v469: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:36 compute-0 sudo[197631]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:36 compute-0 sudo[197821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpbmeuzbeljmjwhsiltpjmuftgevvoua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581256.5123503-401-204265926785031/AnsiballZ_systemd.py'
Dec 01 09:27:36 compute-0 sudo[197821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:27:37 compute-0 python3.9[197823]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 01 09:27:37 compute-0 systemd[1]: Reloading.
Dec 01 09:27:37 compute-0 systemd-rc-local-generator[197854]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:27:37 compute-0 systemd-sysv-generator[197857]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:27:37 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v470: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:37 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Dec 01 09:27:37 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Dec 01 09:27:37 compute-0 sudo[197821]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:38 compute-0 sudo[198014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gonapjvzhynacswmnmjmiykbiidjpxjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581257.769075-409-221371298314611/AnsiballZ_systemd.py'
Dec 01 09:27:38 compute-0 sudo[198014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:27:38 compute-0 python3.9[198016]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 09:27:38 compute-0 ceph-mon[75031]: pgmap v470: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:38 compute-0 sudo[198014]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:39 compute-0 sudo[198169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ictxaocgypwbalsuljvnqzyosnzenhwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581258.7942524-409-23266963404533/AnsiballZ_systemd.py'
Dec 01 09:27:39 compute-0 sudo[198169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:27:39 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:27:39 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v471: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:39 compute-0 python3.9[198171]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 09:27:39 compute-0 sudo[198169]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:39 compute-0 sudo[198324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxxollgynkohmzdojqauiastqczqfrnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581259.6383739-409-4810254954032/AnsiballZ_systemd.py'
Dec 01 09:27:39 compute-0 sudo[198324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:27:40 compute-0 python3.9[198326]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 09:27:40 compute-0 sudo[198324]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:40 compute-0 ceph-mon[75031]: pgmap v471: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:40 compute-0 sudo[198479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikkljzwryagzgzebkkixztxbahzfzady ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581260.521585-409-72598859443400/AnsiballZ_systemd.py'
Dec 01 09:27:40 compute-0 sudo[198479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:27:41 compute-0 python3.9[198481]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 09:27:41 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v472: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:42 compute-0 sudo[198479]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:42 compute-0 ceph-mon[75031]: pgmap v472: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:42 compute-0 sudo[198634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufrzmouvluetplrgezqcucfrcrrxannn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581262.3789847-409-123893464333517/AnsiballZ_systemd.py'
Dec 01 09:27:42 compute-0 sudo[198634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:27:42 compute-0 python3.9[198636]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 09:27:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:27:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:27:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:27:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:27:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:27:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:27:43 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v473: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:44 compute-0 sudo[198634]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:27:44 compute-0 sudo[198789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wauytlapphzsbswevtffdpbfzorxwiow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581264.2240322-409-37564868203465/AnsiballZ_systemd.py'
Dec 01 09:27:44 compute-0 sudo[198789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:27:44 compute-0 ceph-mon[75031]: pgmap v473: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:44 compute-0 python3.9[198791]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 09:27:44 compute-0 sudo[198789]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:45 compute-0 sudo[198944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqfmyhxqxcrzqstuvzdbnafwnnytlanj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581265.0773842-409-55951463658109/AnsiballZ_systemd.py'
Dec 01 09:27:45 compute-0 sudo[198944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:27:45 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v474: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:45 compute-0 python3.9[198946]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 09:27:45 compute-0 sudo[198944]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:46 compute-0 sudo[199099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzetndzprisqikhoezkbzujlmfxnmrak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581265.9366107-409-163387810100453/AnsiballZ_systemd.py'
Dec 01 09:27:46 compute-0 sudo[199099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:27:46 compute-0 python3.9[199101]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 09:27:46 compute-0 ceph-mon[75031]: pgmap v474: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:46 compute-0 sudo[199099]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:47 compute-0 sudo[199254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-torefapkvjurkwmpexrnodljjpoptlag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581266.795882-409-85761802531721/AnsiballZ_systemd.py'
Dec 01 09:27:47 compute-0 sudo[199254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:27:47 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v475: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:47 compute-0 python3.9[199256]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 09:27:47 compute-0 sudo[199254]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:47 compute-0 sudo[199409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfqiiqlabntccqqwcnvxaiqoftwakmah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581267.5896747-409-57883085136670/AnsiballZ_systemd.py'
Dec 01 09:27:47 compute-0 sudo[199409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:27:48 compute-0 python3.9[199411]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 09:27:48 compute-0 sudo[199409]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:48 compute-0 sudo[199564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbtibngfkeskiwsuawpeehmayronxuzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581268.408531-409-84653298679239/AnsiballZ_systemd.py'
Dec 01 09:27:48 compute-0 sudo[199564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:27:49 compute-0 python3.9[199566]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 09:27:49 compute-0 ceph-mon[75031]: pgmap v475: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:49 compute-0 sudo[199564]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:49 compute-0 podman[199568]: 2025-12-01 09:27:49.225270495 +0000 UTC m=+0.132507704 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:27:49 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:27:49 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v476: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:49 compute-0 sudo[199744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afvflwsduyfxlqhewhhmwimmtdahlspx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581269.322739-409-103505812924195/AnsiballZ_systemd.py'
Dec 01 09:27:49 compute-0 sudo[199744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:27:49 compute-0 python3.9[199746]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 09:27:50 compute-0 sudo[199744]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:50 compute-0 sudo[199899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osojybzinsgmmfojcptbkcypkbgapabv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581270.1901507-409-151439452108170/AnsiballZ_systemd.py'
Dec 01 09:27:50 compute-0 sudo[199899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:27:50 compute-0 python3.9[199901]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 09:27:50 compute-0 sudo[199899]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:51 compute-0 ceph-mon[75031]: pgmap v476: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:51 compute-0 sudo[200054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwxhsmsjyjeprmvdjuizsxmmfyqxiqou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581271.0729618-409-105892649185430/AnsiballZ_systemd.py'
Dec 01 09:27:51 compute-0 sudo[200054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:27:51 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v477: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:51 compute-0 python3.9[200056]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 09:27:51 compute-0 sudo[200054]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:52 compute-0 sudo[200209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aixtapsjwmiwqevextuayakfrunmsalu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581272.1659446-511-164848909802411/AnsiballZ_file.py'
Dec 01 09:27:52 compute-0 sudo[200209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:27:52 compute-0 python3.9[200211]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:27:52 compute-0 sudo[200209]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:52 compute-0 auditd[702]: Audit daemon rotating log files
Dec 01 09:27:53 compute-0 ceph-mon[75031]: pgmap v477: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:53 compute-0 sudo[200361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aawabihknlkrmksjwqrrarmwlzzmqyjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581272.9138012-511-123322057322055/AnsiballZ_file.py'
Dec 01 09:27:53 compute-0 sudo[200361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:27:53 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v478: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:53 compute-0 python3.9[200363]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:27:53 compute-0 sudo[200361]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:53 compute-0 sudo[200527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ithmoeiahefgeksnziquyqocbynozsbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581273.591272-511-29080953311915/AnsiballZ_file.py'
Dec 01 09:27:53 compute-0 sudo[200527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:27:53 compute-0 podman[200487]: 2025-12-01 09:27:53.885217816 +0000 UTC m=+0.052366586 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 01 09:27:54 compute-0 python3.9[200535]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:27:54 compute-0 sudo[200527]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:54 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:27:54 compute-0 sudo[200685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etylmimybhvrpejgsdsslgethruimifh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581274.209123-511-36252628077498/AnsiballZ_file.py'
Dec 01 09:27:54 compute-0 sudo[200685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:27:54 compute-0 python3.9[200687]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:27:54 compute-0 sudo[200685]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:55 compute-0 ceph-mon[75031]: pgmap v478: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:55 compute-0 sudo[200837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nahgvqhmjvrvsrwbqesrpldoepyovqjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581274.8390648-511-78571096746252/AnsiballZ_file.py'
Dec 01 09:27:55 compute-0 sudo[200837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:27:55 compute-0 python3.9[200839]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:27:55 compute-0 sudo[200837]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:55 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v479: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:55 compute-0 sudo[200989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhifiikfewryyednibzirnltvatkyiss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581275.454672-511-239583146191328/AnsiballZ_file.py'
Dec 01 09:27:55 compute-0 sudo[200989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:27:55 compute-0 python3.9[200991]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:27:55 compute-0 sudo[200989]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:56 compute-0 sudo[201141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucoofnjvctpcvvuzvvsqbbsborgzccdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581276.1379938-554-175687639951767/AnsiballZ_stat.py'
Dec 01 09:27:56 compute-0 sudo[201141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:27:56 compute-0 python3.9[201143]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:27:56 compute-0 sudo[201141]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:57 compute-0 ceph-mon[75031]: pgmap v479: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:57 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v480: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:57 compute-0 sudo[201266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwvgcqsqxyutpydfbnmgysowodifuisq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581276.1379938-554-175687639951767/AnsiballZ_copy.py'
Dec 01 09:27:57 compute-0 sudo[201266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:27:57 compute-0 python3.9[201268]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764581276.1379938-554-175687639951767/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:27:57 compute-0 sudo[201266]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:58 compute-0 sudo[201418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hposbjezmbhlnqbfsegswpekcgcdseit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581278.1158066-554-224690296856351/AnsiballZ_stat.py'
Dec 01 09:27:58 compute-0 sudo[201418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:27:58 compute-0 python3.9[201420]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:27:58 compute-0 sudo[201418]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:59 compute-0 sudo[201543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpmnkkztbrnfwjricctknpkajbvoeuwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581278.1158066-554-224690296856351/AnsiballZ_copy.py'
Dec 01 09:27:59 compute-0 sudo[201543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:27:59 compute-0 ceph-mon[75031]: pgmap v480: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:59 compute-0 python3.9[201545]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764581278.1158066-554-224690296856351/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:27:59 compute-0 sudo[201543]: pam_unix(sudo:session): session closed for user root
Dec 01 09:27:59 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:27:59 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v481: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:27:59 compute-0 sudo[201695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fheirmklwwaqnzxqyvjgbmvwquthngqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581279.4553518-554-250676472119938/AnsiballZ_stat.py'
Dec 01 09:27:59 compute-0 sudo[201695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:00 compute-0 python3.9[201697]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:28:00 compute-0 sudo[201695]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:00 compute-0 sudo[201820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvcdonkeoyagsyzndrewdjduopfyacal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581279.4553518-554-250676472119938/AnsiballZ_copy.py'
Dec 01 09:28:00 compute-0 sudo[201820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:00 compute-0 python3.9[201822]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764581279.4553518-554-250676472119938/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:00 compute-0 sudo[201820]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:01 compute-0 ceph-mon[75031]: pgmap v481: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:01 compute-0 sudo[201972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkbxlqzmwhbsyppelgqrucvxjlgvigdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581280.9123046-554-74411250698731/AnsiballZ_stat.py'
Dec 01 09:28:01 compute-0 sudo[201972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:01 compute-0 python3.9[201974]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:28:01 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v482: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:01 compute-0 sudo[201972]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:01 compute-0 sudo[202097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqfnptbnrqjmcdakpdtfuksahnbdwhpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581280.9123046-554-74411250698731/AnsiballZ_copy.py'
Dec 01 09:28:01 compute-0 sudo[202097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:01 compute-0 python3.9[202099]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764581280.9123046-554-74411250698731/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:01 compute-0 sudo[202097]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:02 compute-0 sudo[202249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-piksixbiiaavnnazoxnaqavwchdkkoxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581282.080971-554-32897695285985/AnsiballZ_stat.py'
Dec 01 09:28:02 compute-0 sudo[202249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:02 compute-0 python3.9[202251]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:28:02 compute-0 sudo[202249]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:03 compute-0 ceph-mon[75031]: pgmap v482: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:03 compute-0 sudo[202374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfsjoymgfuhyxompftcpugfvzcmzprxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581282.080971-554-32897695285985/AnsiballZ_copy.py'
Dec 01 09:28:03 compute-0 sudo[202374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:03 compute-0 python3.9[202376]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764581282.080971-554-32897695285985/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:03 compute-0 sudo[202374]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:03 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v483: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:03 compute-0 sudo[202526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zksyrigumdchoqrcbcviczuutfaadtjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581283.4543128-554-235847549705753/AnsiballZ_stat.py'
Dec 01 09:28:03 compute-0 sudo[202526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:03 compute-0 python3.9[202528]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:28:04 compute-0 sudo[202526]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:04 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:28:04 compute-0 sudo[202651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itwaoqemtheuletncuuywugdgqcygjtv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581283.4543128-554-235847549705753/AnsiballZ_copy.py'
Dec 01 09:28:04 compute-0 sudo[202651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:04 compute-0 python3.9[202653]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764581283.4543128-554-235847549705753/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:04 compute-0 sudo[202651]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:05 compute-0 sudo[202803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ooaoqasgpbqzvwzpijiiiluxjczmfnyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581284.744884-554-236708452641224/AnsiballZ_stat.py'
Dec 01 09:28:05 compute-0 sudo[202803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:05 compute-0 ceph-mon[75031]: pgmap v483: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:05 compute-0 python3.9[202805]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:28:05 compute-0 sudo[202803]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:05 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v484: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:05 compute-0 sudo[202926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulsjysmfcxizcvxhjrxwganpopfhwxrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581284.744884-554-236708452641224/AnsiballZ_copy.py'
Dec 01 09:28:05 compute-0 sudo[202926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:05 compute-0 python3.9[202928]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764581284.744884-554-236708452641224/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:05 compute-0 sudo[202926]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:06 compute-0 sudo[203078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygnvjzncqqywybokmqszvtdilsaxsjxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581285.8917146-554-149804307004937/AnsiballZ_stat.py'
Dec 01 09:28:06 compute-0 sudo[203078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:06 compute-0 python3.9[203080]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:28:06 compute-0 sudo[203078]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:06 compute-0 sudo[203203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcrqhxdzbauexnkviwxvphtjpzjyasql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581285.8917146-554-149804307004937/AnsiballZ_copy.py'
Dec 01 09:28:06 compute-0 sudo[203203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:06 compute-0 python3.9[203205]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764581285.8917146-554-149804307004937/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:06 compute-0 sudo[203203]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:07 compute-0 ceph-mon[75031]: pgmap v484: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:07 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v485: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:07 compute-0 sudo[203355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xeaydindisqmdtxdnssxysznjwvjcnbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581287.1870568-667-273245887968603/AnsiballZ_command.py'
Dec 01 09:28:07 compute-0 sudo[203355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:07 compute-0 python3.9[203357]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Dec 01 09:28:07 compute-0 sudo[203355]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:08 compute-0 sudo[203508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwgnwssbdyhznnbngsjklmmkvgzgdyba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581287.9650738-676-123003998212823/AnsiballZ_file.py'
Dec 01 09:28:08 compute-0 sudo[203508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:08 compute-0 python3.9[203510]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:08 compute-0 sudo[203508]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:08 compute-0 sudo[203660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iovwwwdtloonrdnhawpgawabdcradobi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581288.6665668-676-28307629525383/AnsiballZ_file.py'
Dec 01 09:28:08 compute-0 sudo[203660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:09 compute-0 sudo[203663]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:28:09 compute-0 sudo[203663]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:28:09 compute-0 sudo[203663]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:09 compute-0 python3.9[203662]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:09 compute-0 sudo[203660]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:09 compute-0 sudo[203688]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:28:09 compute-0 sudo[203688]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:28:09 compute-0 sudo[203688]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:09 compute-0 ceph-mon[75031]: pgmap v485: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:09 compute-0 sudo[203713]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:28:09 compute-0 sudo[203713]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:28:09 compute-0 sudo[203713]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:09 compute-0 sudo[203762]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Dec 01 09:28:09 compute-0 sudo[203762]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:28:09 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:28:09 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v486: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:09 compute-0 sudo[203929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evycgynsnqayhluuxzrfwrgcxkceyhgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581289.2809691-676-106330394536914/AnsiballZ_file.py'
Dec 01 09:28:09 compute-0 sudo[203929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:09 compute-0 sudo[203762]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:09 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:28:09 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:28:09 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec 01 09:28:09 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:28:09 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec 01 09:28:09 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:28:09 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 22bc7996-b20c-4bf2-a779-bbbb8b5ae20e does not exist
Dec 01 09:28:09 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 1b5ff66e-45e3-46d8-93d9-290c8001a41d does not exist
Dec 01 09:28:09 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 80aa6222-f655-4b0e-8a05-bd889f7d92db does not exist
Dec 01 09:28:09 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec 01 09:28:09 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:28:09 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec 01 09:28:09 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:28:09 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:28:09 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:28:09 compute-0 python3.9[203931]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:09 compute-0 sudo[203929]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:09 compute-0 sudo[203946]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:28:09 compute-0 sudo[203946]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:28:09 compute-0 sudo[203946]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:09 compute-0 sudo[203976]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:28:09 compute-0 sudo[203976]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:28:09 compute-0 sudo[203976]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:09 compute-0 sudo[204022]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:28:09 compute-0 sudo[204022]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:28:09 compute-0 sudo[204022]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:10 compute-0 sudo[204075]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --yes --no-systemd
Dec 01 09:28:10 compute-0 sudo[204075]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:28:10 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:28:10 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:28:10 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:28:10 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:28:10 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:28:10 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:28:10 compute-0 sudo[204202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jeyxxkupbdunnuocmrdeamnffcoomlmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581289.975333-676-26624669455529/AnsiballZ_file.py'
Dec 01 09:28:10 compute-0 sudo[204202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:10 compute-0 podman[204238]: 2025-12-01 09:28:10.385916537 +0000 UTC m=+0.035851348 container create fe8d54eb3a7af097d2227fb2f90bb6682c3e05799e7d8f110b6f1556090f5310 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_bose, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec 01 09:28:10 compute-0 python3.9[204210]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:10 compute-0 systemd[1]: Started libpod-conmon-fe8d54eb3a7af097d2227fb2f90bb6682c3e05799e7d8f110b6f1556090f5310.scope.
Dec 01 09:28:10 compute-0 sudo[204202]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:10 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:28:10 compute-0 podman[204238]: 2025-12-01 09:28:10.369163832 +0000 UTC m=+0.019098673 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:28:10 compute-0 podman[204238]: 2025-12-01 09:28:10.47592106 +0000 UTC m=+0.125855891 container init fe8d54eb3a7af097d2227fb2f90bb6682c3e05799e7d8f110b6f1556090f5310 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_bose, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS)
Dec 01 09:28:10 compute-0 podman[204238]: 2025-12-01 09:28:10.484487298 +0000 UTC m=+0.134422109 container start fe8d54eb3a7af097d2227fb2f90bb6682c3e05799e7d8f110b6f1556090f5310 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_bose, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:28:10 compute-0 podman[204238]: 2025-12-01 09:28:10.487715781 +0000 UTC m=+0.137650592 container attach fe8d54eb3a7af097d2227fb2f90bb6682c3e05799e7d8f110b6f1556090f5310 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_bose, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:28:10 compute-0 systemd[1]: libpod-fe8d54eb3a7af097d2227fb2f90bb6682c3e05799e7d8f110b6f1556090f5310.scope: Deactivated successfully.
Dec 01 09:28:10 compute-0 suspicious_bose[204254]: 167 167
Dec 01 09:28:10 compute-0 podman[204238]: 2025-12-01 09:28:10.493629852 +0000 UTC m=+0.143564663 container died fe8d54eb3a7af097d2227fb2f90bb6682c3e05799e7d8f110b6f1556090f5310 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_bose, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=reef, io.buildah.version=1.39.3)
Dec 01 09:28:10 compute-0 conmon[204254]: conmon fe8d54eb3a7af097d222 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-fe8d54eb3a7af097d2227fb2f90bb6682c3e05799e7d8f110b6f1556090f5310.scope/container/memory.events
Dec 01 09:28:10 compute-0 systemd[1]: var-lib-containers-storage-overlay-49f40e1095c32e4bdb0ccda9fd970639d402e948b46fb239af8b947b9d848b6d-merged.mount: Deactivated successfully.
Dec 01 09:28:10 compute-0 podman[204238]: 2025-12-01 09:28:10.540375444 +0000 UTC m=+0.190310255 container remove fe8d54eb3a7af097d2227fb2f90bb6682c3e05799e7d8f110b6f1556090f5310 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_bose, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:28:10 compute-0 systemd[1]: libpod-conmon-fe8d54eb3a7af097d2227fb2f90bb6682c3e05799e7d8f110b6f1556090f5310.scope: Deactivated successfully.
Dec 01 09:28:10 compute-0 podman[204354]: 2025-12-01 09:28:10.700763153 +0000 UTC m=+0.046434524 container create d67e58a9657e48d909e836c5c97ac01c60ffb769e228a6ff4c608bb88d7d49b1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_jennings, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:28:10 compute-0 systemd[1]: Started libpod-conmon-d67e58a9657e48d909e836c5c97ac01c60ffb769e228a6ff4c608bb88d7d49b1.scope.
Dec 01 09:28:10 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:28:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f80134766c796e2390f7bcac3fad2168cd516a6c3d418c3f3e08ace97809e94/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:28:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f80134766c796e2390f7bcac3fad2168cd516a6c3d418c3f3e08ace97809e94/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:28:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f80134766c796e2390f7bcac3fad2168cd516a6c3d418c3f3e08ace97809e94/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:28:10 compute-0 podman[204354]: 2025-12-01 09:28:10.677085068 +0000 UTC m=+0.022756479 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:28:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f80134766c796e2390f7bcac3fad2168cd516a6c3d418c3f3e08ace97809e94/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:28:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f80134766c796e2390f7bcac3fad2168cd516a6c3d418c3f3e08ace97809e94/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:28:10 compute-0 podman[204354]: 2025-12-01 09:28:10.782508888 +0000 UTC m=+0.128180279 container init d67e58a9657e48d909e836c5c97ac01c60ffb769e228a6ff4c608bb88d7d49b1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_jennings, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:28:10 compute-0 podman[204354]: 2025-12-01 09:28:10.791785426 +0000 UTC m=+0.137456817 container start d67e58a9657e48d909e836c5c97ac01c60ffb769e228a6ff4c608bb88d7d49b1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_jennings, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:28:10 compute-0 podman[204354]: 2025-12-01 09:28:10.795210695 +0000 UTC m=+0.140882076 container attach d67e58a9657e48d909e836c5c97ac01c60ffb769e228a6ff4c608bb88d7d49b1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_jennings, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507)
Dec 01 09:28:10 compute-0 sudo[204448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwephnyvfqdnzzlyiavrrzomergrcxeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581290.57876-676-24776701177645/AnsiballZ_file.py'
Dec 01 09:28:10 compute-0 sudo[204448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:11 compute-0 python3.9[204450]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:11 compute-0 sudo[204448]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:11 compute-0 ceph-mon[75031]: pgmap v486: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:11 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v487: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:11 compute-0 sudo[204601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwfvtznsiljsvxglttzvchmiudjiletc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581291.22261-676-179694366603095/AnsiballZ_file.py'
Dec 01 09:28:11 compute-0 sudo[204601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:11 compute-0 python3.9[204605]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:11 compute-0 sudo[204601]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:11 compute-0 sweet_jennings[204393]: --> passed data devices: 0 physical, 3 LVM
Dec 01 09:28:11 compute-0 sweet_jennings[204393]: --> relative data size: 1.0
Dec 01 09:28:11 compute-0 sweet_jennings[204393]: --> All data devices are unavailable
Dec 01 09:28:11 compute-0 systemd[1]: libpod-d67e58a9657e48d909e836c5c97ac01c60ffb769e228a6ff4c608bb88d7d49b1.scope: Deactivated successfully.
Dec 01 09:28:11 compute-0 podman[204354]: 2025-12-01 09:28:11.813099675 +0000 UTC m=+1.158771106 container died d67e58a9657e48d909e836c5c97ac01c60ffb769e228a6ff4c608bb88d7d49b1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_jennings, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:28:11 compute-0 systemd[1]: var-lib-containers-storage-overlay-9f80134766c796e2390f7bcac3fad2168cd516a6c3d418c3f3e08ace97809e94-merged.mount: Deactivated successfully.
Dec 01 09:28:11 compute-0 podman[204354]: 2025-12-01 09:28:11.878030463 +0000 UTC m=+1.223701824 container remove d67e58a9657e48d909e836c5c97ac01c60ffb769e228a6ff4c608bb88d7d49b1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_jennings, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:28:11 compute-0 systemd[1]: libpod-conmon-d67e58a9657e48d909e836c5c97ac01c60ffb769e228a6ff4c608bb88d7d49b1.scope: Deactivated successfully.
Dec 01 09:28:11 compute-0 sudo[204075]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:11 compute-0 sudo[204717]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:28:11 compute-0 sudo[204717]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:28:11 compute-0 sudo[204717]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:12 compute-0 sudo[204764]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:28:12 compute-0 sudo[204764]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:28:12 compute-0 sudo[204764]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:12 compute-0 sudo[204813]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:28:12 compute-0 sudo[204813]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:28:12 compute-0 sudo[204813]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:12 compute-0 sudo[204863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukvqjotvrsvuyffpbqzyriktixgqrhez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581291.8311925-676-274549022473722/AnsiballZ_file.py'
Dec 01 09:28:12 compute-0 sudo[204863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:12 compute-0 sudo[204865]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- lvm list --format json
Dec 01 09:28:12 compute-0 sudo[204865]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:28:12 compute-0 python3.9[204870]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:12 compute-0 sudo[204863]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:12 compute-0 podman[204982]: 2025-12-01 09:28:12.495112522 +0000 UTC m=+0.039012090 container create 29460f64bfb8a02d396006eb58259100fc887e4537d1fe9b6772fe478fc1f5b0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_shirley, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:28:12 compute-0 systemd[1]: Started libpod-conmon-29460f64bfb8a02d396006eb58259100fc887e4537d1fe9b6772fe478fc1f5b0.scope.
Dec 01 09:28:12 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:28:12 compute-0 podman[204982]: 2025-12-01 09:28:12.572894181 +0000 UTC m=+0.116793769 container init 29460f64bfb8a02d396006eb58259100fc887e4537d1fe9b6772fe478fc1f5b0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_shirley, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:28:12 compute-0 podman[204982]: 2025-12-01 09:28:12.478874542 +0000 UTC m=+0.022774130 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:28:12 compute-0 podman[204982]: 2025-12-01 09:28:12.581995335 +0000 UTC m=+0.125894903 container start 29460f64bfb8a02d396006eb58259100fc887e4537d1fe9b6772fe478fc1f5b0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_shirley, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:28:12 compute-0 podman[204982]: 2025-12-01 09:28:12.585183577 +0000 UTC m=+0.129083145 container attach 29460f64bfb8a02d396006eb58259100fc887e4537d1fe9b6772fe478fc1f5b0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_shirley, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:28:12 compute-0 determined_shirley[205035]: 167 167
Dec 01 09:28:12 compute-0 systemd[1]: libpod-29460f64bfb8a02d396006eb58259100fc887e4537d1fe9b6772fe478fc1f5b0.scope: Deactivated successfully.
Dec 01 09:28:12 compute-0 podman[204982]: 2025-12-01 09:28:12.586897856 +0000 UTC m=+0.130797424 container died 29460f64bfb8a02d396006eb58259100fc887e4537d1fe9b6772fe478fc1f5b0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_shirley, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Dec 01 09:28:12 compute-0 systemd[1]: var-lib-containers-storage-overlay-be7ad9939f75d794f82579afd3618c586362b2647d5e8aed93676927d3927f83-merged.mount: Deactivated successfully.
Dec 01 09:28:12 compute-0 podman[204982]: 2025-12-01 09:28:12.621356303 +0000 UTC m=+0.165255871 container remove 29460f64bfb8a02d396006eb58259100fc887e4537d1fe9b6772fe478fc1f5b0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_shirley, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, OSD_FLAVOR=default)
Dec 01 09:28:12 compute-0 systemd[1]: libpod-conmon-29460f64bfb8a02d396006eb58259100fc887e4537d1fe9b6772fe478fc1f5b0.scope: Deactivated successfully.
Dec 01 09:28:12 compute-0 sudo[205120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqlumjriqxadnwsbhmetfwrovaygfqfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581292.438076-676-33169375794608/AnsiballZ_file.py'
Dec 01 09:28:12 compute-0 sudo[205120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:12 compute-0 podman[205128]: 2025-12-01 09:28:12.810686449 +0000 UTC m=+0.071158169 container create 4b2447b5967cfa25ff9171c5621633099e4c3de5d003a924dca6a33fe4dd0145 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_cartwright, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, ceph=True, OSD_FLAVOR=default)
Dec 01 09:28:12 compute-0 systemd[1]: Started libpod-conmon-4b2447b5967cfa25ff9171c5621633099e4c3de5d003a924dca6a33fe4dd0145.scope.
Dec 01 09:28:12 compute-0 podman[205128]: 2025-12-01 09:28:12.762957799 +0000 UTC m=+0.023429549 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:28:12 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:28:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/243094111e362cac9d172b3a21290e1c1a99794435ba016dd7ad7bd2ae2432f2/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:28:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/243094111e362cac9d172b3a21290e1c1a99794435ba016dd7ad7bd2ae2432f2/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:28:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/243094111e362cac9d172b3a21290e1c1a99794435ba016dd7ad7bd2ae2432f2/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:28:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/243094111e362cac9d172b3a21290e1c1a99794435ba016dd7ad7bd2ae2432f2/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:28:12 compute-0 python3.9[205122]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:12 compute-0 sudo[205120]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:12 compute-0 podman[205128]: 2025-12-01 09:28:12.954087797 +0000 UTC m=+0.214559547 container init 4b2447b5967cfa25ff9171c5621633099e4c3de5d003a924dca6a33fe4dd0145 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_cartwright, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:28:12 compute-0 podman[205128]: 2025-12-01 09:28:12.960366789 +0000 UTC m=+0.220838519 container start 4b2447b5967cfa25ff9171c5621633099e4c3de5d003a924dca6a33fe4dd0145 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_cartwright, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:28:12 compute-0 podman[205128]: 2025-12-01 09:28:12.981361626 +0000 UTC m=+0.241833386 container attach 4b2447b5967cfa25ff9171c5621633099e4c3de5d003a924dca6a33fe4dd0145 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_cartwright, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2)
Dec 01 09:28:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:28:13
Dec 01 09:28:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:28:13 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:28:13 compute-0 ceph-mgr[75324]: [balancer INFO root] pools ['backups', 'vms', '.mgr', 'images', 'cephfs.cephfs.meta', 'cephfs.cephfs.data', 'volumes']
Dec 01 09:28:13 compute-0 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec 01 09:28:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:28:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:28:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:28:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:28:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:28:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:28:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:28:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:28:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:28:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:28:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:28:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:28:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:28:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:28:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:28:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:28:13 compute-0 ceph-mon[75031]: pgmap v487: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:13 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v488: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:13 compute-0 sudo[205299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rondiognmitbjvykijngffpsvyeupjln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581293.0429292-676-47095315030212/AnsiballZ_file.py'
Dec 01 09:28:13 compute-0 sudo[205299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]: {
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:     "0": [
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:         {
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             "devices": [
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "/dev/loop3"
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             ],
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             "lv_name": "ceph_lv0",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             "lv_size": "21470642176",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             "name": "ceph_lv0",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             "tags": {
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.cluster_name": "ceph",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.crush_device_class": "",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.encrypted": "0",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.osd_id": "0",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.type": "block",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.vdo": "0"
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             },
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             "type": "block",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             "vg_name": "ceph_vg0"
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:         }
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:     ],
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:     "1": [
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:         {
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             "devices": [
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "/dev/loop4"
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             ],
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             "lv_name": "ceph_lv1",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             "lv_size": "21470642176",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             "name": "ceph_lv1",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             "tags": {
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.cluster_name": "ceph",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.crush_device_class": "",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.encrypted": "0",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.osd_id": "1",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.type": "block",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.vdo": "0"
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             },
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             "type": "block",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             "vg_name": "ceph_vg1"
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:         }
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:     ],
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:     "2": [
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:         {
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             "devices": [
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "/dev/loop5"
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             ],
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             "lv_name": "ceph_lv2",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             "lv_size": "21470642176",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             "name": "ceph_lv2",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             "tags": {
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.cluster_name": "ceph",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.crush_device_class": "",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.encrypted": "0",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.osd_id": "2",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.type": "block",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:                 "ceph.vdo": "0"
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             },
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             "type": "block",
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:             "vg_name": "ceph_vg2"
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:         }
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]:     ]
Dec 01 09:28:13 compute-0 laughing_cartwright[205145]: }
Dec 01 09:28:13 compute-0 python3.9[205301]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:13 compute-0 systemd[1]: libpod-4b2447b5967cfa25ff9171c5621633099e4c3de5d003a924dca6a33fe4dd0145.scope: Deactivated successfully.
Dec 01 09:28:13 compute-0 podman[205128]: 2025-12-01 09:28:13.768224835 +0000 UTC m=+1.028696585 container died 4b2447b5967cfa25ff9171c5621633099e4c3de5d003a924dca6a33fe4dd0145 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_cartwright, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec 01 09:28:13 compute-0 systemd[1]: var-lib-containers-storage-overlay-243094111e362cac9d172b3a21290e1c1a99794435ba016dd7ad7bd2ae2432f2-merged.mount: Deactivated successfully.
Dec 01 09:28:13 compute-0 sudo[205299]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:13 compute-0 podman[205128]: 2025-12-01 09:28:13.833321048 +0000 UTC m=+1.093792778 container remove 4b2447b5967cfa25ff9171c5621633099e4c3de5d003a924dca6a33fe4dd0145 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_cartwright, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:28:13 compute-0 systemd[1]: libpod-conmon-4b2447b5967cfa25ff9171c5621633099e4c3de5d003a924dca6a33fe4dd0145.scope: Deactivated successfully.
Dec 01 09:28:13 compute-0 sudo[204865]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:13 compute-0 sudo[205340]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:28:13 compute-0 sudo[205340]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:28:13 compute-0 sudo[205340]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:13 compute-0 ceph-mgr[75324]: client.0 ms_handle_reset on v2:192.168.122.100:6800/3312476512
Dec 01 09:28:14 compute-0 sudo[205388]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:28:14 compute-0 sudo[205388]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:28:14 compute-0 sudo[205388]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:14 compute-0 sudo[205442]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:28:14 compute-0 sudo[205442]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:28:14 compute-0 sudo[205442]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:14 compute-0 sudo[205490]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- raw list --format json
Dec 01 09:28:14 compute-0 sudo[205490]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:28:14 compute-0 sudo[205565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-compgfderlwdzxlzdssiahhfvhtqnubj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581293.9495325-676-185015546095155/AnsiballZ_file.py'
Dec 01 09:28:14 compute-0 sudo[205565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:14 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:28:14 compute-0 python3.9[205567]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:14 compute-0 sudo[205565]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:14 compute-0 podman[205607]: 2025-12-01 09:28:14.469672864 +0000 UTC m=+0.041749399 container create 2d92853788f450fb8b4ef8a204ad90ade140038c88a7a688597ba05d9e350651 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_elion, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3)
Dec 01 09:28:14 compute-0 systemd[1]: Started libpod-conmon-2d92853788f450fb8b4ef8a204ad90ade140038c88a7a688597ba05d9e350651.scope.
Dec 01 09:28:14 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:28:14 compute-0 podman[205607]: 2025-12-01 09:28:14.452579059 +0000 UTC m=+0.024655594 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:28:14 compute-0 podman[205607]: 2025-12-01 09:28:14.54941879 +0000 UTC m=+0.121495345 container init 2d92853788f450fb8b4ef8a204ad90ade140038c88a7a688597ba05d9e350651 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_elion, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:28:14 compute-0 podman[205607]: 2025-12-01 09:28:14.557583136 +0000 UTC m=+0.129659661 container start 2d92853788f450fb8b4ef8a204ad90ade140038c88a7a688597ba05d9e350651 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_elion, ceph=True, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507)
Dec 01 09:28:14 compute-0 podman[205607]: 2025-12-01 09:28:14.560947204 +0000 UTC m=+0.133023729 container attach 2d92853788f450fb8b4ef8a204ad90ade140038c88a7a688597ba05d9e350651 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_elion, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250507, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec 01 09:28:14 compute-0 infallible_elion[205646]: 167 167
Dec 01 09:28:14 compute-0 systemd[1]: libpod-2d92853788f450fb8b4ef8a204ad90ade140038c88a7a688597ba05d9e350651.scope: Deactivated successfully.
Dec 01 09:28:14 compute-0 podman[205607]: 2025-12-01 09:28:14.562678564 +0000 UTC m=+0.134755089 container died 2d92853788f450fb8b4ef8a204ad90ade140038c88a7a688597ba05d9e350651 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_elion, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Dec 01 09:28:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-2c38ad948cdde6b23e2c9aeff377c33e8ce699284ae4a169ec8937ba05f6be81-merged.mount: Deactivated successfully.
Dec 01 09:28:14 compute-0 podman[205607]: 2025-12-01 09:28:14.600792336 +0000 UTC m=+0.172868861 container remove 2d92853788f450fb8b4ef8a204ad90ade140038c88a7a688597ba05d9e350651 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_elion, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec 01 09:28:14 compute-0 systemd[1]: libpod-conmon-2d92853788f450fb8b4ef8a204ad90ade140038c88a7a688597ba05d9e350651.scope: Deactivated successfully.
Dec 01 09:28:14 compute-0 podman[205745]: 2025-12-01 09:28:14.769474225 +0000 UTC m=+0.041106800 container create a7e7211f7ee85ee82da7d67d8b1866df0312cae5f394e1f57a9a03641869e11c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quirky_ishizaka, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:28:14 compute-0 systemd[1]: Started libpod-conmon-a7e7211f7ee85ee82da7d67d8b1866df0312cae5f394e1f57a9a03641869e11c.scope.
Dec 01 09:28:14 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:28:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e36358037fbbb4de1271a434b2d15e138974d5f43c9b941f13b7524c32f92610/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:28:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e36358037fbbb4de1271a434b2d15e138974d5f43c9b941f13b7524c32f92610/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:28:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e36358037fbbb4de1271a434b2d15e138974d5f43c9b941f13b7524c32f92610/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:28:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e36358037fbbb4de1271a434b2d15e138974d5f43c9b941f13b7524c32f92610/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:28:14 compute-0 sudo[205814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sarvnpslapbaopyivpkwlggowrzaersx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581294.5804136-676-239138545537113/AnsiballZ_file.py'
Dec 01 09:28:14 compute-0 sudo[205814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:14 compute-0 podman[205745]: 2025-12-01 09:28:14.841666812 +0000 UTC m=+0.113299377 container init a7e7211f7ee85ee82da7d67d8b1866df0312cae5f394e1f57a9a03641869e11c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quirky_ishizaka, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:28:14 compute-0 podman[205745]: 2025-12-01 09:28:14.750689032 +0000 UTC m=+0.022321607 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:28:14 compute-0 podman[205745]: 2025-12-01 09:28:14.852188426 +0000 UTC m=+0.123820991 container start a7e7211f7ee85ee82da7d67d8b1866df0312cae5f394e1f57a9a03641869e11c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quirky_ishizaka, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, ceph=True)
Dec 01 09:28:14 compute-0 podman[205745]: 2025-12-01 09:28:14.855481512 +0000 UTC m=+0.127114087 container attach a7e7211f7ee85ee82da7d67d8b1866df0312cae5f394e1f57a9a03641869e11c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quirky_ishizaka, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:28:15 compute-0 python3.9[205816]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:15 compute-0 sudo[205814]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:15 compute-0 ceph-mon[75031]: pgmap v488: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:15 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v489: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:15 compute-0 sudo[205968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yenqbohaqduwpugvvyjxmlaorwhrugez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581295.1961553-676-40996256188317/AnsiballZ_file.py'
Dec 01 09:28:15 compute-0 sudo[205968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:15 compute-0 python3.9[205970]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:15 compute-0 sudo[205968]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:15 compute-0 quirky_ishizaka[205805]: {
Dec 01 09:28:15 compute-0 quirky_ishizaka[205805]:     "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec 01 09:28:15 compute-0 quirky_ishizaka[205805]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:28:15 compute-0 quirky_ishizaka[205805]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 01 09:28:15 compute-0 quirky_ishizaka[205805]:         "osd_id": 0,
Dec 01 09:28:15 compute-0 quirky_ishizaka[205805]:         "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:28:15 compute-0 quirky_ishizaka[205805]:         "type": "bluestore"
Dec 01 09:28:15 compute-0 quirky_ishizaka[205805]:     },
Dec 01 09:28:15 compute-0 quirky_ishizaka[205805]:     "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec 01 09:28:15 compute-0 quirky_ishizaka[205805]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:28:15 compute-0 quirky_ishizaka[205805]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 01 09:28:15 compute-0 quirky_ishizaka[205805]:         "osd_id": 1,
Dec 01 09:28:15 compute-0 quirky_ishizaka[205805]:         "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:28:15 compute-0 quirky_ishizaka[205805]:         "type": "bluestore"
Dec 01 09:28:15 compute-0 quirky_ishizaka[205805]:     },
Dec 01 09:28:15 compute-0 quirky_ishizaka[205805]:     "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec 01 09:28:15 compute-0 quirky_ishizaka[205805]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:28:15 compute-0 quirky_ishizaka[205805]:         "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec 01 09:28:15 compute-0 quirky_ishizaka[205805]:         "osd_id": 2,
Dec 01 09:28:15 compute-0 quirky_ishizaka[205805]:         "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:28:15 compute-0 quirky_ishizaka[205805]:         "type": "bluestore"
Dec 01 09:28:15 compute-0 quirky_ishizaka[205805]:     }
Dec 01 09:28:15 compute-0 quirky_ishizaka[205805]: }
Dec 01 09:28:15 compute-0 systemd[1]: libpod-a7e7211f7ee85ee82da7d67d8b1866df0312cae5f394e1f57a9a03641869e11c.scope: Deactivated successfully.
Dec 01 09:28:15 compute-0 podman[205745]: 2025-12-01 09:28:15.901085565 +0000 UTC m=+1.172718110 container died a7e7211f7ee85ee82da7d67d8b1866df0312cae5f394e1f57a9a03641869e11c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quirky_ishizaka, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.schema-version=1.0)
Dec 01 09:28:15 compute-0 systemd[1]: libpod-a7e7211f7ee85ee82da7d67d8b1866df0312cae5f394e1f57a9a03641869e11c.scope: Consumed 1.053s CPU time.
Dec 01 09:28:15 compute-0 systemd[1]: var-lib-containers-storage-overlay-e36358037fbbb4de1271a434b2d15e138974d5f43c9b941f13b7524c32f92610-merged.mount: Deactivated successfully.
Dec 01 09:28:15 compute-0 podman[205745]: 2025-12-01 09:28:15.954136669 +0000 UTC m=+1.225769224 container remove a7e7211f7ee85ee82da7d67d8b1866df0312cae5f394e1f57a9a03641869e11c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quirky_ishizaka, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:28:15 compute-0 systemd[1]: libpod-conmon-a7e7211f7ee85ee82da7d67d8b1866df0312cae5f394e1f57a9a03641869e11c.scope: Deactivated successfully.
Dec 01 09:28:15 compute-0 sudo[205490]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:15 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:28:15 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:28:15 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:28:16 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:28:16 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 103817a1-9080-4f6d-9587-583b9e2f4d5e does not exist
Dec 01 09:28:16 compute-0 sudo[206084]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:28:16 compute-0 sudo[206084]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:28:16 compute-0 sudo[206084]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:16 compute-0 sudo[206118]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:28:16 compute-0 sudo[206118]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:28:16 compute-0 sudo[206118]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:16 compute-0 sudo[206210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cifgueklcgfajmfyodpocwiiwaugpcii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581295.9408474-676-250961245088896/AnsiballZ_file.py'
Dec 01 09:28:16 compute-0 sudo[206210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:16 compute-0 python3.9[206212]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:16 compute-0 sudo[206210]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:16 compute-0 sudo[206362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eeqvolygndwvtqctivsvklgihehbhqro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581296.663786-676-142263593879771/AnsiballZ_file.py'
Dec 01 09:28:16 compute-0 sudo[206362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:16 compute-0 ceph-mon[75031]: pgmap v489: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:16 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:28:16 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:28:17 compute-0 python3.9[206364]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:17 compute-0 sudo[206362]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:17 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v490: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:17 compute-0 sudo[206514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhanqmbrkqjfipmrehzabtriqfwzuwkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581297.3584514-775-150555474384716/AnsiballZ_stat.py'
Dec 01 09:28:17 compute-0 sudo[206514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:17 compute-0 python3.9[206516]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:28:17 compute-0 sudo[206514]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:18 compute-0 sudo[206637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-keawyhrslblawrmtfqfyyjjgpradkxwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581297.3584514-775-150555474384716/AnsiballZ_copy.py'
Dec 01 09:28:18 compute-0 sudo[206637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:18 compute-0 python3.9[206639]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764581297.3584514-775-150555474384716/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:18 compute-0 sudo[206637]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:28:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:28:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:28:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:28:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:28:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:28:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:28:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:28:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:28:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:28:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:28:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:28:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec 01 09:28:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:28:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:28:18 compute-0 sudo[206789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmlstkbsrhuphazfimmryarpfzunkmif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581298.6639555-775-41504929997351/AnsiballZ_stat.py'
Dec 01 09:28:18 compute-0 sudo[206789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:19 compute-0 ceph-mon[75031]: pgmap v490: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:19 compute-0 python3.9[206791]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:28:19 compute-0 sudo[206789]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:19 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:28:19 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v491: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:19 compute-0 sudo[206921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nphzbxcqllbttopwksslvfcmhithwqul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581298.6639555-775-41504929997351/AnsiballZ_copy.py'
Dec 01 09:28:19 compute-0 sudo[206921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:19 compute-0 podman[206886]: 2025-12-01 09:28:19.585402168 +0000 UTC m=+0.101512137 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 01 09:28:19 compute-0 python3.9[206930]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764581298.6639555-775-41504929997351/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:19 compute-0 sudo[206921]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:20 compute-0 sudo[207089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvvypvpqjixiuidyssqktpzzbiwlcnbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581299.845651-775-69273951457665/AnsiballZ_stat.py'
Dec 01 09:28:20 compute-0 sudo[207089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:20 compute-0 python3.9[207091]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:28:20 compute-0 sudo[207089]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:28:20.461 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:28:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:28:20.462 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:28:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:28:20.462 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:28:20 compute-0 sudo[207212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twqpihwfxykjxsngyzyjmrrdczjkxpuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581299.845651-775-69273951457665/AnsiballZ_copy.py'
Dec 01 09:28:20 compute-0 sudo[207212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:20 compute-0 python3.9[207214]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764581299.845651-775-69273951457665/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:20 compute-0 sudo[207212]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:21 compute-0 ceph-mon[75031]: pgmap v491: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:21 compute-0 sudo[207364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aoxglkvfexrperykffhiulxhaoratimi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581300.9906926-775-103812361140376/AnsiballZ_stat.py'
Dec 01 09:28:21 compute-0 sudo[207364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:21 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v492: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:21 compute-0 python3.9[207366]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:28:21 compute-0 sudo[207364]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:21 compute-0 sudo[207487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umyrjbftewfmmtifzuqsiztamepygoyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581300.9906926-775-103812361140376/AnsiballZ_copy.py'
Dec 01 09:28:21 compute-0 sudo[207487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:21 compute-0 python3.9[207489]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764581300.9906926-775-103812361140376/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:22 compute-0 sudo[207487]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:22 compute-0 sudo[207639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjspyqpuqxktqkjakveflbplojpkazfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581302.183005-775-163393557397729/AnsiballZ_stat.py'
Dec 01 09:28:22 compute-0 sudo[207639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:22 compute-0 python3.9[207641]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:28:22 compute-0 sudo[207639]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:22 compute-0 sudo[207762]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phbrldkmfvpiopbvqnqwvdidczyjwdfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581302.183005-775-163393557397729/AnsiballZ_copy.py'
Dec 01 09:28:22 compute-0 sudo[207762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:23 compute-0 ceph-mon[75031]: pgmap v492: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:23 compute-0 python3.9[207764]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764581302.183005-775-163393557397729/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:23 compute-0 sudo[207762]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:23 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v493: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:23 compute-0 sudo[207914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpggtpwdyoqsztotfodxnlitevkcvcbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581303.4017506-775-93537649270581/AnsiballZ_stat.py'
Dec 01 09:28:23 compute-0 sudo[207914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:24 compute-0 python3.9[207916]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:28:24 compute-0 sudo[207914]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:24 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:28:24 compute-0 sudo[208054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mefadfxdsclhuzclttcxxmxxqwaxpabb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581303.4017506-775-93537649270581/AnsiballZ_copy.py'
Dec 01 09:28:24 compute-0 sudo[208054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:24 compute-0 podman[208011]: 2025-12-01 09:28:24.40818252 +0000 UTC m=+0.050627575 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Dec 01 09:28:24 compute-0 python3.9[208058]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764581303.4017506-775-93537649270581/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:24 compute-0 sudo[208054]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:24 compute-0 sudo[208208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alfwyshwdxwdrtihpupkmknmchzgelmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581304.7469008-775-160116931691371/AnsiballZ_stat.py'
Dec 01 09:28:24 compute-0 sudo[208208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:25 compute-0 ceph-mon[75031]: pgmap v493: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:25 compute-0 python3.9[208210]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:28:25 compute-0 sudo[208208]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:25 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v494: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:25 compute-0 sudo[208331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsmdeqwkfbtxwutkujjgwmtugyprboni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581304.7469008-775-160116931691371/AnsiballZ_copy.py'
Dec 01 09:28:25 compute-0 sudo[208331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:25 compute-0 python3.9[208333]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764581304.7469008-775-160116931691371/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:25 compute-0 sudo[208331]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:26 compute-0 sudo[208483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqahcndzrxxzczwpasjnnulwbqrlbkpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581305.9030778-775-91481268059870/AnsiballZ_stat.py'
Dec 01 09:28:26 compute-0 sudo[208483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:26 compute-0 python3.9[208485]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:28:26 compute-0 sudo[208483]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:26 compute-0 sudo[208606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxbhykbhhalogsusxatirqblfclcnxsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581305.9030778-775-91481268059870/AnsiballZ_copy.py'
Dec 01 09:28:26 compute-0 sudo[208606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:26 compute-0 python3.9[208608]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764581305.9030778-775-91481268059870/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:26 compute-0 sudo[208606]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:27 compute-0 ceph-mon[75031]: pgmap v494: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:27 compute-0 sudo[208758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjvtkdmhueqpkdtdsgoroihpekghsgyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581307.0551562-775-110121608971846/AnsiballZ_stat.py'
Dec 01 09:28:27 compute-0 sudo[208758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:27 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v495: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:27 compute-0 python3.9[208760]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:28:27 compute-0 sudo[208758]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:27 compute-0 sudo[208881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgpjeyowgpdxtvwtpzowodawnkmjwjlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581307.0551562-775-110121608971846/AnsiballZ_copy.py'
Dec 01 09:28:27 compute-0 sudo[208881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:27 compute-0 python3.9[208883]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764581307.0551562-775-110121608971846/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:28 compute-0 sudo[208881]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:28 compute-0 sudo[209033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmndfbtvqdlxvqrbbuhcmwkfpcikfjnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581308.1291695-775-78100187418647/AnsiballZ_stat.py'
Dec 01 09:28:28 compute-0 sudo[209033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:28 compute-0 python3.9[209035]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:28:28 compute-0 sudo[209033]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:29 compute-0 ceph-mon[75031]: pgmap v495: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:29 compute-0 sudo[209156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zovoulnzqaijkdajlvmlxjpemkrotwtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581308.1291695-775-78100187418647/AnsiballZ_copy.py'
Dec 01 09:28:29 compute-0 sudo[209156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:29 compute-0 python3.9[209158]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764581308.1291695-775-78100187418647/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:29 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:28:29 compute-0 sudo[209156]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:29 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v496: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:29 compute-0 sudo[209308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-toueccjckoqmymcgerkfomuikglhutyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581309.463916-775-43498553648316/AnsiballZ_stat.py'
Dec 01 09:28:29 compute-0 sudo[209308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:29 compute-0 python3.9[209310]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:28:29 compute-0 sudo[209308]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:30 compute-0 sudo[209431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbhenvlmvlndnxbmualhulycqdbpmkjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581309.463916-775-43498553648316/AnsiballZ_copy.py'
Dec 01 09:28:30 compute-0 sudo[209431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:30 compute-0 python3.9[209433]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764581309.463916-775-43498553648316/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:30 compute-0 sudo[209431]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:30 compute-0 sudo[209583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfxjgakfwjyvwsbmpqxxoonzutndbboh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581310.7062364-775-198040237461578/AnsiballZ_stat.py'
Dec 01 09:28:30 compute-0 sudo[209583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:31 compute-0 ceph-mon[75031]: pgmap v496: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:31 compute-0 python3.9[209585]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:28:31 compute-0 sudo[209583]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:31 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v497: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:31 compute-0 sudo[209706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyqffgqngwjqbzpwjiltzvgcjusioxbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581310.7062364-775-198040237461578/AnsiballZ_copy.py'
Dec 01 09:28:31 compute-0 sudo[209706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:31 compute-0 python3.9[209708]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764581310.7062364-775-198040237461578/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:31 compute-0 sudo[209706]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:32 compute-0 sudo[209858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wahtjmnwxcuxivsrmqkbpqnaiyyncobd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581311.9012904-775-181873014793249/AnsiballZ_stat.py'
Dec 01 09:28:32 compute-0 sudo[209858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:32 compute-0 python3.9[209860]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:28:32 compute-0 sudo[209858]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:32 compute-0 sudo[209981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldbdzowfmmidhejyjvznqvbdtlyounpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581311.9012904-775-181873014793249/AnsiballZ_copy.py'
Dec 01 09:28:32 compute-0 sudo[209981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:32 compute-0 python3.9[209983]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764581311.9012904-775-181873014793249/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:32 compute-0 sudo[209981]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:33 compute-0 ceph-mon[75031]: pgmap v497: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:33 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v498: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:33 compute-0 sudo[210133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgzvtreyqxezzhsyudaggmzbxbplcqpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581313.0578291-775-181381597770655/AnsiballZ_stat.py'
Dec 01 09:28:33 compute-0 sudo[210133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:33 compute-0 python3.9[210135]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:28:33 compute-0 sudo[210133]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:33 compute-0 sudo[210256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivdynkxlzjkjbszljmvrtoybqpvqqacu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581313.0578291-775-181381597770655/AnsiballZ_copy.py'
Dec 01 09:28:33 compute-0 sudo[210256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:34 compute-0 python3.9[210258]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764581313.0578291-775-181381597770655/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:34 compute-0 sudo[210256]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:34 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:28:34 compute-0 python3.9[210408]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:28:35 compute-0 ceph-mon[75031]: pgmap v498: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:35 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v499: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:35 compute-0 sudo[210561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylvfapewgqnzexuthgwwnvctycuuiobb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581315.063853-981-116931464334383/AnsiballZ_seboolean.py'
Dec 01 09:28:35 compute-0 sudo[210561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:35 compute-0 python3.9[210563]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Dec 01 09:28:37 compute-0 sudo[210561]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:37 compute-0 ceph-mon[75031]: pgmap v499: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:37 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v500: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:37 compute-0 sudo[210717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnmkxlgmipqckpwlaiwtodhobjhjjebj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581317.2506433-989-257528633720338/AnsiballZ_copy.py'
Dec 01 09:28:37 compute-0 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Dec 01 09:28:37 compute-0 sudo[210717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:37 compute-0 python3.9[210719]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:37 compute-0 sudo[210717]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:38 compute-0 sudo[210869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alzlaauvgawfnlwconfuafgypwkawudf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581318.1690824-989-21910559784901/AnsiballZ_copy.py'
Dec 01 09:28:38 compute-0 sudo[210869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:38 compute-0 ceph-mon[75031]: pgmap v500: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:38 compute-0 python3.9[210871]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:38 compute-0 sudo[210869]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:39 compute-0 sudo[211021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmilenllcnbwgluijizyowkgvvtkazgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581318.887303-989-158248417285659/AnsiballZ_copy.py'
Dec 01 09:28:39 compute-0 sudo[211021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:39 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:28:39 compute-0 python3.9[211023]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:39 compute-0 sudo[211021]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:39 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v501: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:39 compute-0 sudo[211173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzsqsovvmhxuiktwzsgvsabgazocubnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581319.521201-989-18630529276592/AnsiballZ_copy.py'
Dec 01 09:28:39 compute-0 sudo[211173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:39 compute-0 python3.9[211175]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:40 compute-0 sudo[211173]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:40 compute-0 sudo[211325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsmjaunpghublmwcyyqyhctqseyweiyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581320.1398833-989-256240996525484/AnsiballZ_copy.py'
Dec 01 09:28:40 compute-0 sudo[211325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:40 compute-0 ceph-mon[75031]: pgmap v501: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:40 compute-0 python3.9[211327]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:40 compute-0 sudo[211325]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:41 compute-0 sudo[211477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqcmsgaqldyizspfvtdywkwgzddcyerk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581320.7782557-1025-258105036208649/AnsiballZ_copy.py'
Dec 01 09:28:41 compute-0 sudo[211477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:41 compute-0 python3.9[211479]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:41 compute-0 sudo[211477]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:41 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v502: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:41 compute-0 sudo[211629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usdmjgetmwtwzcdvpiigfwoxjmudisnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581321.4191906-1025-135951594274563/AnsiballZ_copy.py'
Dec 01 09:28:41 compute-0 sudo[211629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:41 compute-0 python3.9[211631]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:41 compute-0 sudo[211629]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:42 compute-0 sudo[211781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kndsbmpdwocwpiwzglcripbkxynfcupf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581322.0549622-1025-183023134129125/AnsiballZ_copy.py'
Dec 01 09:28:42 compute-0 sudo[211781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:42 compute-0 ceph-mon[75031]: pgmap v502: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:42 compute-0 python3.9[211783]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:42 compute-0 sudo[211781]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:42 compute-0 sudo[211933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txctlqjcgyaalrchvfhfliqowgblavtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581322.703929-1025-245273698489428/AnsiballZ_copy.py'
Dec 01 09:28:42 compute-0 sudo[211933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:28:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:28:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:28:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:28:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:28:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:28:43 compute-0 python3.9[211935]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:43 compute-0 sudo[211933]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:43 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v503: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:43 compute-0 sudo[212085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adytpujwogfwnbddqzxjbgpugbhaiolk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581323.3312442-1025-118792602057388/AnsiballZ_copy.py'
Dec 01 09:28:43 compute-0 sudo[212085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:43 compute-0 python3.9[212087]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:43 compute-0 sudo[212085]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:28:44 compute-0 sudo[212237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pheiqludionaclycusyuyjnnjmekxoiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581324.0393124-1061-139495297786561/AnsiballZ_systemd.py'
Dec 01 09:28:44 compute-0 sudo[212237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:44 compute-0 python3.9[212239]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 09:28:44 compute-0 systemd[1]: Reloading.
Dec 01 09:28:44 compute-0 ceph-mon[75031]: pgmap v503: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:44 compute-0 systemd-rc-local-generator[212263]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:28:44 compute-0 systemd-sysv-generator[212266]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:28:44 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #27. Immutable memtables: 0.
Dec 01 09:28:44 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:28:44.793724) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 09:28:44 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 27
Dec 01 09:28:44 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581324793781, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 2035, "num_deletes": 251, "total_data_size": 2345115, "memory_usage": 2391208, "flush_reason": "Manual Compaction"}
Dec 01 09:28:44 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #28: started
Dec 01 09:28:44 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581324809155, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 28, "file_size": 2274390, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 8982, "largest_seqno": 11016, "table_properties": {"data_size": 2265213, "index_size": 5799, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2309, "raw_key_size": 17692, "raw_average_key_size": 19, "raw_value_size": 2246934, "raw_average_value_size": 2466, "num_data_blocks": 267, "num_entries": 911, "num_filter_entries": 911, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764581091, "oldest_key_time": 1764581091, "file_creation_time": 1764581324, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 28, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:28:44 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 15473 microseconds, and 6717 cpu microseconds.
Dec 01 09:28:44 compute-0 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 09:28:44 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:28:44.809213) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #28: 2274390 bytes OK
Dec 01 09:28:44 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:28:44.809238) [db/memtable_list.cc:519] [default] Level-0 commit table #28 started
Dec 01 09:28:44 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:28:44.810768) [db/memtable_list.cc:722] [default] Level-0 commit table #28: memtable #1 done
Dec 01 09:28:44 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:28:44.810783) EVENT_LOG_v1 {"time_micros": 1764581324810779, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 01 09:28:44 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:28:44.810801) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 01 09:28:44 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 2336643, prev total WAL file size 2336643, number of live WAL files 2.
Dec 01 09:28:44 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000024.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:28:44 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:28:44.812559) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300353032' seq:72057594037927935, type:22 .. '7061786F7300373534' seq:0, type:0; will stop at (end)
Dec 01 09:28:44 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 01 09:28:44 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [28(2221KB)], [26(4517KB)]
Dec 01 09:28:44 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581324812594, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [28], "files_L6": [26], "score": -1, "input_data_size": 6900475, "oldest_snapshot_seqno": -1}
Dec 01 09:28:44 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #29: 3183 keys, 5798191 bytes, temperature: kUnknown
Dec 01 09:28:44 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581324851388, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 29, "file_size": 5798191, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 5772889, "index_size": 16233, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8005, "raw_key_size": 73570, "raw_average_key_size": 23, "raw_value_size": 5711862, "raw_average_value_size": 1794, "num_data_blocks": 718, "num_entries": 3183, "num_filter_entries": 3183, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580340, "oldest_key_time": 0, "file_creation_time": 1764581324, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:28:44 compute-0 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 09:28:44 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:28:44.851665) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 5798191 bytes
Dec 01 09:28:44 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:28:44.853026) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 177.5 rd, 149.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.2, 4.4 +0.0 blob) out(5.5 +0.0 blob), read-write-amplify(5.6) write-amplify(2.5) OK, records in: 3697, records dropped: 514 output_compression: NoCompression
Dec 01 09:28:44 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:28:44.853048) EVENT_LOG_v1 {"time_micros": 1764581324853038, "job": 10, "event": "compaction_finished", "compaction_time_micros": 38876, "compaction_time_cpu_micros": 15183, "output_level": 6, "num_output_files": 1, "total_output_size": 5798191, "num_input_records": 3697, "num_output_records": 3183, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 09:28:44 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000028.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:28:44 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581324853735, "job": 10, "event": "table_file_deletion", "file_number": 28}
Dec 01 09:28:44 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:28:44 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581324854953, "job": 10, "event": "table_file_deletion", "file_number": 26}
Dec 01 09:28:44 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:28:44.812420) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:28:44 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:28:44.855134) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:28:44 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:28:44.855141) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:28:44 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:28:44.855143) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:28:44 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:28:44.855144) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:28:44 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:28:44.855146) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:28:45 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Dec 01 09:28:45 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Dec 01 09:28:45 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Dec 01 09:28:45 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Dec 01 09:28:45 compute-0 systemd[1]: Starting libvirt logging daemon...
Dec 01 09:28:45 compute-0 systemd[1]: Started libvirt logging daemon.
Dec 01 09:28:45 compute-0 sudo[212237]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:45 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v504: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:45 compute-0 sudo[212431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igrzvfhvqwmuuldtsrpdtqcfrbuysmgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581325.376928-1061-82171952947231/AnsiballZ_systemd.py'
Dec 01 09:28:45 compute-0 sudo[212431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:45 compute-0 python3.9[212433]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 09:28:46 compute-0 systemd[1]: Reloading.
Dec 01 09:28:46 compute-0 systemd-rc-local-generator[212458]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:28:46 compute-0 systemd-sysv-generator[212462]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:28:46 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Dec 01 09:28:46 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Dec 01 09:28:46 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Dec 01 09:28:46 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Dec 01 09:28:46 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Dec 01 09:28:46 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Dec 01 09:28:46 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Dec 01 09:28:46 compute-0 systemd[1]: Started libvirt nodedev daemon.
Dec 01 09:28:46 compute-0 sudo[212431]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:46 compute-0 ceph-mon[75031]: pgmap v504: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:46 compute-0 sudo[212647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qizeldqfaatctgbyvndlrmclefqewkqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581326.5836246-1061-161495408489927/AnsiballZ_systemd.py'
Dec 01 09:28:46 compute-0 sudo[212647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:47 compute-0 python3.9[212649]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 09:28:47 compute-0 systemd[1]: Reloading.
Dec 01 09:28:47 compute-0 systemd-rc-local-generator[212674]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:28:47 compute-0 systemd-sysv-generator[212678]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:28:47 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v505: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:47 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Dec 01 09:28:47 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Dec 01 09:28:47 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Dec 01 09:28:47 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Dec 01 09:28:47 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Dec 01 09:28:47 compute-0 systemd[1]: Starting libvirt proxy daemon...
Dec 01 09:28:47 compute-0 systemd[1]: Started libvirt proxy daemon.
Dec 01 09:28:47 compute-0 sudo[212647]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:47 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Dec 01 09:28:47 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Dec 01 09:28:48 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Dec 01 09:28:48 compute-0 sudo[212865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmhuyyxhxnrfbdnkogzoqtquatctxnzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581327.7859924-1061-61316133553774/AnsiballZ_systemd.py'
Dec 01 09:28:48 compute-0 sudo[212865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:48 compute-0 python3.9[212867]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 09:28:48 compute-0 systemd[1]: Reloading.
Dec 01 09:28:48 compute-0 systemd-rc-local-generator[212896]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:28:48 compute-0 systemd-sysv-generator[212900]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:28:48 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Dec 01 09:28:48 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Dec 01 09:28:48 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Dec 01 09:28:48 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Dec 01 09:28:48 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Dec 01 09:28:48 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Dec 01 09:28:48 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Dec 01 09:28:48 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Dec 01 09:28:48 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Dec 01 09:28:48 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Dec 01 09:28:48 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Dec 01 09:28:48 compute-0 systemd[1]: Started libvirt QEMU daemon.
Dec 01 09:28:48 compute-0 ceph-mon[75031]: pgmap v505: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:48 compute-0 sudo[212865]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:48 compute-0 setroubleshoot[212685]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 1e55f597-2f82-4df1-b992-f49fa7fcf036
Dec 01 09:28:48 compute-0 setroubleshoot[212685]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Dec 01 09:28:48 compute-0 setroubleshoot[212685]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 1e55f597-2f82-4df1-b992-f49fa7fcf036
Dec 01 09:28:48 compute-0 setroubleshoot[212685]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Dec 01 09:28:49 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:28:49 compute-0 sudo[213082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqwpvitybrrzxastwwyzjgkmhpwdvlyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581328.9987907-1061-135609691220931/AnsiballZ_systemd.py'
Dec 01 09:28:49 compute-0 sudo[213082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:49 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v506: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:49 compute-0 python3.9[213084]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 09:28:49 compute-0 systemd[1]: Reloading.
Dec 01 09:28:49 compute-0 systemd-rc-local-generator[213128]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:28:49 compute-0 systemd-sysv-generator[213133]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:28:49 compute-0 podman[213086]: 2025-12-01 09:28:49.842913836 +0000 UTC m=+0.118459138 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:28:50 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Dec 01 09:28:50 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Dec 01 09:28:50 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Dec 01 09:28:50 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Dec 01 09:28:50 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Dec 01 09:28:50 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Dec 01 09:28:50 compute-0 systemd[1]: Starting libvirt secret daemon...
Dec 01 09:28:50 compute-0 systemd[1]: Started libvirt secret daemon.
Dec 01 09:28:50 compute-0 sudo[213082]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:50 compute-0 sudo[213320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-erdcmvuzyharchdkmcsgxeqymvfixgbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581330.4140632-1098-111516222343345/AnsiballZ_file.py'
Dec 01 09:28:50 compute-0 sudo[213320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:50 compute-0 ceph-mon[75031]: pgmap v506: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:50 compute-0 python3.9[213322]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:50 compute-0 sudo[213320]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:51 compute-0 sudo[213472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbbeyuqhypaodmkupchrgwitqxjmuosc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581331.0784726-1106-215895808423079/AnsiballZ_find.py'
Dec 01 09:28:51 compute-0 sudo[213472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:51 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v507: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:51 compute-0 python3.9[213474]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 01 09:28:51 compute-0 sudo[213472]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:52 compute-0 sudo[213624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxbpxouriyvhdvhavtczohawpcrfjafq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581331.7727544-1114-272082706301027/AnsiballZ_command.py'
Dec 01 09:28:52 compute-0 sudo[213624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:52 compute-0 python3.9[213626]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;
                                             echo ceph
                                             awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:28:52 compute-0 sudo[213624]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:52 compute-0 ceph-mon[75031]: pgmap v507: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:52 compute-0 python3.9[213780]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 01 09:28:53 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v508: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:53 compute-0 python3.9[213930]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:28:54 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:28:54 compute-0 python3.9[214051]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764581333.3169858-1133-99746231312967/.source.xml follow=False _original_basename=secret.xml.j2 checksum=972bce57f1b968e3bea30025319af4764744aa0e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:54 compute-0 sudo[214212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhtxkjpfsvmoeiqqiiowzlmmvekucbyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581334.5470934-1148-249505246030017/AnsiballZ_command.py'
Dec 01 09:28:54 compute-0 sudo[214212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:54 compute-0 podman[214175]: 2025-12-01 09:28:54.842206052 +0000 UTC m=+0.050975575 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Dec 01 09:28:55 compute-0 python3.9[214220]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine 5620a9fb-e540-5250-a0e8-7aaad5347e3b
                                             virsh secret-define --file /tmp/secret.xml
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:28:55 compute-0 polkitd[43441]: Registered Authentication Agent for unix-process:214222:317999 (system bus name :1.2772 [pkttyagent --process 214222 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Dec 01 09:28:55 compute-0 polkitd[43441]: Unregistered Authentication Agent for unix-process:214222:317999 (system bus name :1.2772, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Dec 01 09:28:55 compute-0 polkitd[43441]: Registered Authentication Agent for unix-process:214221:317998 (system bus name :1.2773 [pkttyagent --process 214221 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Dec 01 09:28:55 compute-0 polkitd[43441]: Unregistered Authentication Agent for unix-process:214221:317998 (system bus name :1.2773, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Dec 01 09:28:55 compute-0 ceph-mon[75031]: pgmap v508: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:55 compute-0 sudo[214212]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:55 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v509: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:55 compute-0 python3.9[214382]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:56 compute-0 sudo[214532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qacwuyfeuzsplhexzzhqgxwmwoxeyboh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581335.9278057-1164-37220808488467/AnsiballZ_command.py'
Dec 01 09:28:56 compute-0 sudo[214532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:56 compute-0 sudo[214532]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:56 compute-0 sudo[214685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyjjmlsgajqnpnyklpnovznqqqdygwvq ; FSID=5620a9fb-e540-5250-a0e8-7aaad5347e3b KEY=AQDWWy1pAAAAABAA0JvObGCkXGU+EEwqsvh/8w== /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581336.6147208-1172-45028606557949/AnsiballZ_command.py'
Dec 01 09:28:56 compute-0 sudo[214685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:57 compute-0 polkitd[43441]: Registered Authentication Agent for unix-process:214688:318205 (system bus name :1.2776 [pkttyagent --process 214688 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Dec 01 09:28:57 compute-0 polkitd[43441]: Unregistered Authentication Agent for unix-process:214688:318205 (system bus name :1.2776, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Dec 01 09:28:57 compute-0 sudo[214685]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:57 compute-0 ceph-mon[75031]: pgmap v509: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:57 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v510: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:57 compute-0 sudo[214843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhdupuhqcxjjfqnrjxtzbuzubpdfiinf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581337.314746-1180-182215625252349/AnsiballZ_copy.py'
Dec 01 09:28:57 compute-0 sudo[214843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:57 compute-0 python3.9[214845]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:28:57 compute-0 sudo[214843]: pam_unix(sudo:session): session closed for user root
Dec 01 09:28:59 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Dec 01 09:28:59 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Dec 01 09:28:59 compute-0 ceph-mon[75031]: pgmap v510: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:59 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:28:59 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v511: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:28:59 compute-0 sudo[214995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwvoerfdsdsojfjelwgrcvnzwtyerask ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581337.9535317-1188-28803400859215/AnsiballZ_stat.py'
Dec 01 09:28:59 compute-0 sudo[214995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:28:59 compute-0 python3.9[214997]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:29:00 compute-0 sudo[214995]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:00 compute-0 sudo[215118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnvarxtmokvukxxvpgrdsxiltkvillag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581337.9535317-1188-28803400859215/AnsiballZ_copy.py'
Dec 01 09:29:00 compute-0 sudo[215118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:00 compute-0 python3.9[215120]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764581337.9535317-1188-28803400859215/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:29:00 compute-0 sudo[215118]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:01 compute-0 sudo[215270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-giirdfvwvxpxshegxwksxckaaiwbkqbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581340.8772695-1204-64166948757237/AnsiballZ_file.py'
Dec 01 09:29:01 compute-0 sudo[215270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:01 compute-0 ceph-mon[75031]: pgmap v511: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:01 compute-0 python3.9[215272]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:29:01 compute-0 sudo[215270]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:01 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v512: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:01 compute-0 sudo[215422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnamtleqsohqjesqvrjbgcfnosykgaew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581341.538494-1212-133243831194082/AnsiballZ_stat.py'
Dec 01 09:29:01 compute-0 sudo[215422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:02 compute-0 python3.9[215424]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:29:02 compute-0 sudo[215422]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:02 compute-0 sudo[215500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgpgjoqxlkgrcdceocybfxpeeotmireg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581341.538494-1212-133243831194082/AnsiballZ_file.py'
Dec 01 09:29:02 compute-0 sudo[215500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:02 compute-0 python3.9[215502]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:29:02 compute-0 sudo[215500]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:02 compute-0 sudo[215652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpsmgfqgjbhszsfoofdycxntgafkxyrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581342.6534083-1224-78415637233624/AnsiballZ_stat.py'
Dec 01 09:29:02 compute-0 sudo[215652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:03 compute-0 python3.9[215654]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:29:03 compute-0 sudo[215652]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:03 compute-0 ceph-mon[75031]: pgmap v512: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:03 compute-0 sudo[215730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipmosfoobjdebdgyozkluandxhoibovi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581342.6534083-1224-78415637233624/AnsiballZ_file.py'
Dec 01 09:29:03 compute-0 sudo[215730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:03 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v513: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:03 compute-0 python3.9[215732]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.287xdcm9 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:29:03 compute-0 sudo[215730]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:04 compute-0 sudo[215882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfjzolfsvuqfuyjgvfowyfswswuplbut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581343.7808375-1236-169107001821598/AnsiballZ_stat.py'
Dec 01 09:29:04 compute-0 sudo[215882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:04 compute-0 python3.9[215884]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:29:04 compute-0 sudo[215882]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:04 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:29:04 compute-0 ceph-mon[75031]: pgmap v513: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:04 compute-0 sudo[215960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhhxcvwkfcgvaymfvvxmhgcecxmyrmca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581343.7808375-1236-169107001821598/AnsiballZ_file.py'
Dec 01 09:29:04 compute-0 sudo[215960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:04 compute-0 python3.9[215962]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:29:04 compute-0 sudo[215960]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:05 compute-0 sudo[216112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-loensgguutihzkbwrcahejszbzlwpmbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581344.9774282-1249-254636612384746/AnsiballZ_command.py'
Dec 01 09:29:05 compute-0 sudo[216112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:05 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v514: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:05 compute-0 python3.9[216114]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:29:05 compute-0 sudo[216112]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:06 compute-0 sudo[216265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdwbovtyupqxomapsedyznlltbarvdei ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764581345.670555-1257-29524043154482/AnsiballZ_edpm_nftables_from_files.py'
Dec 01 09:29:06 compute-0 sudo[216265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:06 compute-0 ceph-mon[75031]: pgmap v514: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:06 compute-0 python3[216267]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 01 09:29:06 compute-0 sudo[216265]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:07 compute-0 sudo[216417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgnlgeigmtgfpysrgwlvvzafewgienae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581346.681085-1265-253918861272046/AnsiballZ_stat.py'
Dec 01 09:29:07 compute-0 sudo[216417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:07 compute-0 python3.9[216419]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:29:07 compute-0 sudo[216417]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:07 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v515: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:07 compute-0 sudo[216495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otbpomytouxwlncrrtlbsxvrscjybbtu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581346.681085-1265-253918861272046/AnsiballZ_file.py'
Dec 01 09:29:07 compute-0 sudo[216495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:07 compute-0 python3.9[216497]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:29:07 compute-0 sudo[216495]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:08 compute-0 sudo[216647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-erhzqprzydqtuaitktteaawtdbaxcdga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581347.933272-1277-87497231560620/AnsiballZ_stat.py'
Dec 01 09:29:08 compute-0 sudo[216647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:08 compute-0 python3.9[216649]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:29:08 compute-0 sudo[216647]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:08 compute-0 ceph-mon[75031]: pgmap v515: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:08 compute-0 sudo[216725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywpljvvciekreeuaoupwxcqnczbnfzjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581347.933272-1277-87497231560620/AnsiballZ_file.py'
Dec 01 09:29:08 compute-0 sudo[216725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:08 compute-0 python3.9[216727]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:29:08 compute-0 sudo[216725]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:09 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:29:09 compute-0 sudo[216877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sinehsbvlhuntlvdczukixdeknlgfanr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581349.0877268-1289-37901948111869/AnsiballZ_stat.py'
Dec 01 09:29:09 compute-0 sudo[216877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:09 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v516: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:09 compute-0 python3.9[216879]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:29:09 compute-0 sudo[216877]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:09 compute-0 sudo[216955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amkfjsbcmxlxbcqugqeizpzcyznaltfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581349.0877268-1289-37901948111869/AnsiballZ_file.py'
Dec 01 09:29:09 compute-0 sudo[216955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:10 compute-0 python3.9[216957]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:29:10 compute-0 sudo[216955]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:10 compute-0 ceph-mon[75031]: pgmap v516: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:10 compute-0 sudo[217107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auqpfidfbuwxqlgwttcowmwcvkhgnfvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581350.2767282-1301-199578667533939/AnsiballZ_stat.py'
Dec 01 09:29:10 compute-0 sudo[217107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:10 compute-0 python3.9[217109]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:29:10 compute-0 sudo[217107]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:11 compute-0 sudo[217185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofcqseviqtyfohwjpisyfkbumsvdwmqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581350.2767282-1301-199578667533939/AnsiballZ_file.py'
Dec 01 09:29:11 compute-0 sudo[217185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:11 compute-0 python3.9[217187]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:29:11 compute-0 sudo[217185]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:11 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v517: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:11 compute-0 sudo[217337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efmmxtzugvpefmnardmybpueqgwyszst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581351.4179072-1313-134781417579801/AnsiballZ_stat.py'
Dec 01 09:29:11 compute-0 sudo[217337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:11 compute-0 python3.9[217339]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:29:11 compute-0 sudo[217337]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:12 compute-0 sudo[217462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krpvwmonzufbssmyqrwcxwlfcdptxoli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581351.4179072-1313-134781417579801/AnsiballZ_copy.py'
Dec 01 09:29:12 compute-0 sudo[217462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:12 compute-0 ceph-mon[75031]: pgmap v517: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:12 compute-0 python3.9[217464]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764581351.4179072-1313-134781417579801/.source.nft follow=False _original_basename=ruleset.j2 checksum=ac3ce8ce2d33fa5fe0a79b0c811c97734ce43fa5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:29:12 compute-0 sudo[217462]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:12 compute-0 sudo[217614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zustmgbfqjnvlxglqjfrqnydichegxxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581352.6875632-1328-174227074598726/AnsiballZ_file.py'
Dec 01 09:29:12 compute-0 sudo[217614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:29:13
Dec 01 09:29:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:29:13 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:29:13 compute-0 ceph-mgr[75324]: [balancer INFO root] pools ['vms', 'volumes', 'cephfs.cephfs.data', 'images', '.mgr', 'backups', 'cephfs.cephfs.meta']
Dec 01 09:29:13 compute-0 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec 01 09:29:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:29:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:29:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:29:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:29:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:29:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:29:13 compute-0 python3.9[217616]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:29:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:29:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:29:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:29:13 compute-0 sudo[217614]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:29:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:29:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:29:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:29:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:29:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:29:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:29:13 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v518: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:13 compute-0 sudo[217766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyhipmohutyaxbtrwowxufyvioktyezl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581353.3176234-1336-45463867369698/AnsiballZ_command.py'
Dec 01 09:29:13 compute-0 sudo[217766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:13 compute-0 python3.9[217768]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:29:13 compute-0 sudo[217766]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:14 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:29:14 compute-0 sudo[217921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpfnnhzmvnmsyvwrwzbfcjfjzkxkdcez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581353.992396-1344-8970580052592/AnsiballZ_blockinfile.py'
Dec 01 09:29:14 compute-0 sudo[217921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:14 compute-0 ceph-mon[75031]: pgmap v518: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:14 compute-0 python3.9[217923]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:29:14 compute-0 sudo[217921]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:15 compute-0 sudo[218073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvdbbjzaajzbmmnbbcvwajzcjbytanqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581354.8889797-1353-106376431922415/AnsiballZ_command.py'
Dec 01 09:29:15 compute-0 sudo[218073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:15 compute-0 python3.9[218075]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:29:15 compute-0 sudo[218073]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:15 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v519: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:15 compute-0 sudo[218226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyphedkxslqrlrvrlwzrdemygcmqtlnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581355.5556147-1361-45574497117088/AnsiballZ_stat.py'
Dec 01 09:29:15 compute-0 sudo[218226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:16 compute-0 python3.9[218228]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:29:16 compute-0 sudo[218226]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:16 compute-0 sudo[218255]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:29:16 compute-0 sudo[218255]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:29:16 compute-0 sudo[218255]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:16 compute-0 sudo[218303]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:29:16 compute-0 sudo[218303]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:29:16 compute-0 sudo[218303]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:16 compute-0 sudo[218357]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:29:16 compute-0 sudo[218357]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:29:16 compute-0 sudo[218357]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:16 compute-0 sudo[218403]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Dec 01 09:29:16 compute-0 sudo[218403]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:29:16 compute-0 sudo[218480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdhfsjlbsitziskqsjovgjhzjowldjmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581356.1824653-1369-77572417006134/AnsiballZ_command.py'
Dec 01 09:29:16 compute-0 sudo[218480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:16 compute-0 ceph-mon[75031]: pgmap v519: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:16 compute-0 python3.9[218482]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:29:16 compute-0 sudo[218480]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:16 compute-0 sudo[218403]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:16 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:29:16 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:29:16 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec 01 09:29:16 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:29:16 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec 01 09:29:16 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:29:16 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 56aff331-8ab8-4dd8-a9b0-f5a9d59847b8 does not exist
Dec 01 09:29:16 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev a19cb7a8-3e7d-4a95-b2a4-e975d2443207 does not exist
Dec 01 09:29:16 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 96e88a56-2d0e-42e8-be95-674d6adfda93 does not exist
Dec 01 09:29:16 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec 01 09:29:16 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:29:16 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec 01 09:29:16 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:29:16 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:29:16 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:29:16 compute-0 sudo[218545]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:29:16 compute-0 sudo[218545]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:29:16 compute-0 sudo[218545]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:16 compute-0 sudo[218598]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:29:16 compute-0 sudo[218598]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:29:16 compute-0 sudo[218598]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:16 compute-0 sudo[218643]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:29:16 compute-0 sudo[218643]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:29:16 compute-0 sudo[218643]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:17 compute-0 sudo[218691]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --yes --no-systemd
Dec 01 09:29:17 compute-0 sudo[218691]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:29:17 compute-0 sudo[218766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ueslvszloremxhcmcrowjavzkpzxmolt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581356.8482928-1377-38399286012014/AnsiballZ_file.py'
Dec 01 09:29:17 compute-0 sudo[218766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:17 compute-0 python3.9[218768]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:29:17 compute-0 sudo[218766]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:17 compute-0 podman[218809]: 2025-12-01 09:29:17.347555627 +0000 UTC m=+0.047503435 container create e557ee0218ba771c604af17aaee3251177788dcf17ba73d74feb7f2e9ea38587 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=tender_curie, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:29:17 compute-0 systemd[1]: Started libpod-conmon-e557ee0218ba771c604af17aaee3251177788dcf17ba73d74feb7f2e9ea38587.scope.
Dec 01 09:29:17 compute-0 podman[218809]: 2025-12-01 09:29:17.329195266 +0000 UTC m=+0.029143094 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:29:17 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v520: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:17 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:29:17 compute-0 podman[218809]: 2025-12-01 09:29:17.443132872 +0000 UTC m=+0.143080710 container init e557ee0218ba771c604af17aaee3251177788dcf17ba73d74feb7f2e9ea38587 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=tender_curie, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec 01 09:29:17 compute-0 podman[218809]: 2025-12-01 09:29:17.451454033 +0000 UTC m=+0.151401851 container start e557ee0218ba771c604af17aaee3251177788dcf17ba73d74feb7f2e9ea38587 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=tender_curie, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:29:17 compute-0 podman[218809]: 2025-12-01 09:29:17.455352765 +0000 UTC m=+0.155300613 container attach e557ee0218ba771c604af17aaee3251177788dcf17ba73d74feb7f2e9ea38587 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=tender_curie, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.build-date=20250507)
Dec 01 09:29:17 compute-0 tender_curie[218850]: 167 167
Dec 01 09:29:17 compute-0 systemd[1]: libpod-e557ee0218ba771c604af17aaee3251177788dcf17ba73d74feb7f2e9ea38587.scope: Deactivated successfully.
Dec 01 09:29:17 compute-0 podman[218809]: 2025-12-01 09:29:17.457839817 +0000 UTC m=+0.157787625 container died e557ee0218ba771c604af17aaee3251177788dcf17ba73d74feb7f2e9ea38587 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=tender_curie, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:29:17 compute-0 systemd[1]: var-lib-containers-storage-overlay-aba983264edde1c34908a495a9f031eac5688654f1e505d1aaf74f11d7b72f2b-merged.mount: Deactivated successfully.
Dec 01 09:29:17 compute-0 podman[218809]: 2025-12-01 09:29:17.49493174 +0000 UTC m=+0.194879548 container remove e557ee0218ba771c604af17aaee3251177788dcf17ba73d74feb7f2e9ea38587 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=tender_curie, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2)
Dec 01 09:29:17 compute-0 systemd[1]: libpod-conmon-e557ee0218ba771c604af17aaee3251177788dcf17ba73d74feb7f2e9ea38587.scope: Deactivated successfully.
Dec 01 09:29:17 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:29:17 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:29:17 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:29:17 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:29:17 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:29:17 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:29:17 compute-0 podman[218939]: 2025-12-01 09:29:17.661356554 +0000 UTC m=+0.041444630 container create 61c3b09089ed951be00db0eee55be80eb4d46a164f07929e6a01470018be571e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_jennings, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:29:17 compute-0 systemd[1]: Started libpod-conmon-61c3b09089ed951be00db0eee55be80eb4d46a164f07929e6a01470018be571e.scope.
Dec 01 09:29:17 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:29:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cc20b70bfeb344c8050da1f33b07de8ec7ef6f4c82709c28dfe62118018a5927/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:29:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cc20b70bfeb344c8050da1f33b07de8ec7ef6f4c82709c28dfe62118018a5927/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:29:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cc20b70bfeb344c8050da1f33b07de8ec7ef6f4c82709c28dfe62118018a5927/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:29:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cc20b70bfeb344c8050da1f33b07de8ec7ef6f4c82709c28dfe62118018a5927/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:29:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cc20b70bfeb344c8050da1f33b07de8ec7ef6f4c82709c28dfe62118018a5927/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:29:17 compute-0 podman[218939]: 2025-12-01 09:29:17.735006824 +0000 UTC m=+0.115094920 container init 61c3b09089ed951be00db0eee55be80eb4d46a164f07929e6a01470018be571e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_jennings, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Dec 01 09:29:17 compute-0 podman[218939]: 2025-12-01 09:29:17.644345972 +0000 UTC m=+0.024434048 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:29:17 compute-0 podman[218939]: 2025-12-01 09:29:17.747034822 +0000 UTC m=+0.127122908 container start 61c3b09089ed951be00db0eee55be80eb4d46a164f07929e6a01470018be571e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_jennings, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:29:17 compute-0 podman[218939]: 2025-12-01 09:29:17.75076071 +0000 UTC m=+0.130848786 container attach 61c3b09089ed951be00db0eee55be80eb4d46a164f07929e6a01470018be571e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_jennings, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 01 09:29:17 compute-0 sudo[219020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkflwinsjgwbuozkrvuvlqosrohqpbfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581357.5100846-1385-209251408862537/AnsiballZ_stat.py'
Dec 01 09:29:17 compute-0 sudo[219020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:17 compute-0 python3.9[219022]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:29:17 compute-0 sudo[219020]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:18 compute-0 sudo[219143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrxafbbnmhbonvkqfggbskvkftvitezp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581357.5100846-1385-209251408862537/AnsiballZ_copy.py'
Dec 01 09:29:18 compute-0 sudo[219143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:29:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:29:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:29:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:29:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:29:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:29:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:29:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:29:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:29:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:29:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:29:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:29:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec 01 09:29:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:29:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:29:18 compute-0 python3.9[219145]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764581357.5100846-1385-209251408862537/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:29:18 compute-0 sudo[219143]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:18 compute-0 intelligent_jennings[218989]: --> passed data devices: 0 physical, 3 LVM
Dec 01 09:29:18 compute-0 intelligent_jennings[218989]: --> relative data size: 1.0
Dec 01 09:29:18 compute-0 intelligent_jennings[218989]: --> All data devices are unavailable
Dec 01 09:29:19 compute-0 systemd[1]: libpod-61c3b09089ed951be00db0eee55be80eb4d46a164f07929e6a01470018be571e.scope: Deactivated successfully.
Dec 01 09:29:19 compute-0 podman[218939]: 2025-12-01 09:29:19.010922928 +0000 UTC m=+1.391011004 container died 61c3b09089ed951be00db0eee55be80eb4d46a164f07929e6a01470018be571e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_jennings, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:29:19 compute-0 systemd[1]: libpod-61c3b09089ed951be00db0eee55be80eb4d46a164f07929e6a01470018be571e.scope: Consumed 1.141s CPU time.
Dec 01 09:29:19 compute-0 sudo[219320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stythnvellzmjlxqtxabrshyuqqomimo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581358.7448077-1400-265868212999446/AnsiballZ_stat.py'
Dec 01 09:29:19 compute-0 sudo[219320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:19 compute-0 ceph-mon[75031]: pgmap v520: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:19 compute-0 python3.9[219331]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:29:19 compute-0 sudo[219320]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:19 compute-0 systemd[1]: var-lib-containers-storage-overlay-cc20b70bfeb344c8050da1f33b07de8ec7ef6f4c82709c28dfe62118018a5927-merged.mount: Deactivated successfully.
Dec 01 09:29:19 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:29:19 compute-0 podman[218939]: 2025-12-01 09:29:19.335196627 +0000 UTC m=+1.715284713 container remove 61c3b09089ed951be00db0eee55be80eb4d46a164f07929e6a01470018be571e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_jennings, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Dec 01 09:29:19 compute-0 sudo[218691]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:19 compute-0 systemd[1]: libpod-conmon-61c3b09089ed951be00db0eee55be80eb4d46a164f07929e6a01470018be571e.scope: Deactivated successfully.
Dec 01 09:29:19 compute-0 sudo[219380]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:29:19 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v521: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:19 compute-0 sudo[219380]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:29:19 compute-0 sudo[219380]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:19 compute-0 sudo[219428]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:29:19 compute-0 sudo[219428]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:29:19 compute-0 sudo[219428]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:19 compute-0 sudo[219477]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:29:19 compute-0 sudo[219477]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:29:19 compute-0 sudo[219477]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:19 compute-0 sudo[219527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkkrtomziusxybzwrdovczwfztdiwzcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581358.7448077-1400-265868212999446/AnsiballZ_copy.py'
Dec 01 09:29:19 compute-0 sudo[219527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:19 compute-0 sudo[219529]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- lvm list --format json
Dec 01 09:29:19 compute-0 sudo[219529]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:29:19 compute-0 python3.9[219536]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764581358.7448077-1400-265868212999446/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:29:19 compute-0 sudo[219527]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:19 compute-0 podman[219620]: 2025-12-01 09:29:19.901186267 +0000 UTC m=+0.029518114 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:29:19 compute-0 podman[219620]: 2025-12-01 09:29:19.994778874 +0000 UTC m=+0.123110701 container create be022af12d249bf87dcbde0a7d9efbfd86973b286f1b7450407c65346f82ecf8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_tharp, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS)
Dec 01 09:29:20 compute-0 systemd[1]: Started libpod-conmon-be022af12d249bf87dcbde0a7d9efbfd86973b286f1b7450407c65346f82ecf8.scope.
Dec 01 09:29:20 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:29:20 compute-0 podman[219620]: 2025-12-01 09:29:20.08451456 +0000 UTC m=+0.212846407 container init be022af12d249bf87dcbde0a7d9efbfd86973b286f1b7450407c65346f82ecf8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_tharp, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:29:20 compute-0 podman[219620]: 2025-12-01 09:29:20.092363657 +0000 UTC m=+0.220695484 container start be022af12d249bf87dcbde0a7d9efbfd86973b286f1b7450407c65346f82ecf8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_tharp, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:29:20 compute-0 interesting_tharp[219689]: 167 167
Dec 01 09:29:20 compute-0 systemd[1]: libpod-be022af12d249bf87dcbde0a7d9efbfd86973b286f1b7450407c65346f82ecf8.scope: Deactivated successfully.
Dec 01 09:29:20 compute-0 podman[219620]: 2025-12-01 09:29:20.162135205 +0000 UTC m=+0.290467032 container attach be022af12d249bf87dcbde0a7d9efbfd86973b286f1b7450407c65346f82ecf8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_tharp, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:29:20 compute-0 podman[219620]: 2025-12-01 09:29:20.163402362 +0000 UTC m=+0.291734209 container died be022af12d249bf87dcbde0a7d9efbfd86973b286f1b7450407c65346f82ecf8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_tharp, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:29:20 compute-0 systemd[1]: var-lib-containers-storage-overlay-1b7e60088397a842e833bdd0c25bd91d1a5e276e5a6a8eb47b2f705dbb9b39e5-merged.mount: Deactivated successfully.
Dec 01 09:29:20 compute-0 podman[219620]: 2025-12-01 09:29:20.228538156 +0000 UTC m=+0.356869983 container remove be022af12d249bf87dcbde0a7d9efbfd86973b286f1b7450407c65346f82ecf8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_tharp, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:29:20 compute-0 systemd[1]: libpod-conmon-be022af12d249bf87dcbde0a7d9efbfd86973b286f1b7450407c65346f82ecf8.scope: Deactivated successfully.
Dec 01 09:29:20 compute-0 sudo[219800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwclpgshbrphstqqnjbeufmkteqljdjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581359.9236312-1415-91334037796122/AnsiballZ_stat.py'
Dec 01 09:29:20 compute-0 sudo[219800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:20 compute-0 podman[219690]: 2025-12-01 09:29:20.312166294 +0000 UTC m=+0.237717386 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3)
Dec 01 09:29:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:29:20.462 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:29:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:29:20.462 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:29:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:29:20.463 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:29:20 compute-0 python3.9[219805]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:29:20 compute-0 podman[219817]: 2025-12-01 09:29:20.392767886 +0000 UTC m=+0.025918511 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:29:20 compute-0 podman[219817]: 2025-12-01 09:29:20.493362465 +0000 UTC m=+0.126513080 container create 76437cf4edca6e754e568c2ed2504ef753e11638884e3663d63affb91d680e74 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_beaver, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec 01 09:29:20 compute-0 sudo[219800]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:20 compute-0 systemd[1]: Started libpod-conmon-76437cf4edca6e754e568c2ed2504ef753e11638884e3663d63affb91d680e74.scope.
Dec 01 09:29:20 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:29:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53f8cdfa72f7e468458f2834eac30a4bf8983ddd5fd1574bcde880ae35e2bbb0/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:29:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53f8cdfa72f7e468458f2834eac30a4bf8983ddd5fd1574bcde880ae35e2bbb0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:29:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53f8cdfa72f7e468458f2834eac30a4bf8983ddd5fd1574bcde880ae35e2bbb0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:29:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53f8cdfa72f7e468458f2834eac30a4bf8983ddd5fd1574bcde880ae35e2bbb0/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:29:20 compute-0 podman[219817]: 2025-12-01 09:29:20.700155367 +0000 UTC m=+0.333305972 container init 76437cf4edca6e754e568c2ed2504ef753e11638884e3663d63affb91d680e74 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_beaver, ceph=True, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Dec 01 09:29:20 compute-0 podman[219817]: 2025-12-01 09:29:20.707960752 +0000 UTC m=+0.341111327 container start 76437cf4edca6e754e568c2ed2504ef753e11638884e3663d63affb91d680e74 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_beaver, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:29:20 compute-0 podman[219817]: 2025-12-01 09:29:20.719949309 +0000 UTC m=+0.353099904 container attach 76437cf4edca6e754e568c2ed2504ef753e11638884e3663d63affb91d680e74 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_beaver, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:29:20 compute-0 sudo[219959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shhnzvykoycjflwvqfoajqjvhaxhjgdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581359.9236312-1415-91334037796122/AnsiballZ_copy.py'
Dec 01 09:29:20 compute-0 sudo[219959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:21 compute-0 python3.9[219961]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764581359.9236312-1415-91334037796122/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:29:21 compute-0 sudo[219959]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:21 compute-0 ceph-mon[75031]: pgmap v521: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:21 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v522: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:21 compute-0 strange_beaver[219841]: {
Dec 01 09:29:21 compute-0 strange_beaver[219841]:     "0": [
Dec 01 09:29:21 compute-0 strange_beaver[219841]:         {
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             "devices": [
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "/dev/loop3"
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             ],
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             "lv_name": "ceph_lv0",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             "lv_size": "21470642176",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             "name": "ceph_lv0",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             "tags": {
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.cluster_name": "ceph",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.crush_device_class": "",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.encrypted": "0",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.osd_id": "0",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.type": "block",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.vdo": "0"
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             },
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             "type": "block",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             "vg_name": "ceph_vg0"
Dec 01 09:29:21 compute-0 strange_beaver[219841]:         }
Dec 01 09:29:21 compute-0 strange_beaver[219841]:     ],
Dec 01 09:29:21 compute-0 strange_beaver[219841]:     "1": [
Dec 01 09:29:21 compute-0 strange_beaver[219841]:         {
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             "devices": [
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "/dev/loop4"
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             ],
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             "lv_name": "ceph_lv1",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             "lv_size": "21470642176",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             "name": "ceph_lv1",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             "tags": {
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.cluster_name": "ceph",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.crush_device_class": "",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.encrypted": "0",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.osd_id": "1",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.type": "block",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.vdo": "0"
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             },
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             "type": "block",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             "vg_name": "ceph_vg1"
Dec 01 09:29:21 compute-0 strange_beaver[219841]:         }
Dec 01 09:29:21 compute-0 strange_beaver[219841]:     ],
Dec 01 09:29:21 compute-0 strange_beaver[219841]:     "2": [
Dec 01 09:29:21 compute-0 strange_beaver[219841]:         {
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             "devices": [
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "/dev/loop5"
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             ],
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             "lv_name": "ceph_lv2",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             "lv_size": "21470642176",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             "name": "ceph_lv2",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             "tags": {
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.cluster_name": "ceph",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.crush_device_class": "",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.encrypted": "0",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.osd_id": "2",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.type": "block",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:                 "ceph.vdo": "0"
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             },
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             "type": "block",
Dec 01 09:29:21 compute-0 strange_beaver[219841]:             "vg_name": "ceph_vg2"
Dec 01 09:29:21 compute-0 strange_beaver[219841]:         }
Dec 01 09:29:21 compute-0 strange_beaver[219841]:     ]
Dec 01 09:29:21 compute-0 strange_beaver[219841]: }
Dec 01 09:29:21 compute-0 systemd[1]: libpod-76437cf4edca6e754e568c2ed2504ef753e11638884e3663d63affb91d680e74.scope: Deactivated successfully.
Dec 01 09:29:21 compute-0 podman[219817]: 2025-12-01 09:29:21.562177039 +0000 UTC m=+1.195327615 container died 76437cf4edca6e754e568c2ed2504ef753e11638884e3663d63affb91d680e74 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_beaver, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:29:21 compute-0 systemd[1]: var-lib-containers-storage-overlay-53f8cdfa72f7e468458f2834eac30a4bf8983ddd5fd1574bcde880ae35e2bbb0-merged.mount: Deactivated successfully.
Dec 01 09:29:21 compute-0 podman[219817]: 2025-12-01 09:29:21.781534284 +0000 UTC m=+1.414684879 container remove 76437cf4edca6e754e568c2ed2504ef753e11638884e3663d63affb91d680e74 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_beaver, io.buildah.version=1.39.3, CEPH_REF=reef, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:29:21 compute-0 systemd[1]: libpod-conmon-76437cf4edca6e754e568c2ed2504ef753e11638884e3663d63affb91d680e74.scope: Deactivated successfully.
Dec 01 09:29:21 compute-0 sudo[219529]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:21 compute-0 sudo[220129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-digxpqlycxdudsnhkamphdrnvpdlkxby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581361.305922-1430-37980994820192/AnsiballZ_systemd.py'
Dec 01 09:29:21 compute-0 sudo[220129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:21 compute-0 sudo[220130]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:29:21 compute-0 sudo[220130]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:29:21 compute-0 sudo[220130]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:21 compute-0 sudo[220157]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:29:21 compute-0 sudo[220157]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:29:21 compute-0 sudo[220157]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:21 compute-0 sudo[220182]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:29:21 compute-0 sudo[220182]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:29:21 compute-0 sudo[220182]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:22 compute-0 sudo[220207]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- raw list --format json
Dec 01 09:29:22 compute-0 sudo[220207]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:29:22 compute-0 python3.9[220144]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:29:22 compute-0 systemd[1]: Reloading.
Dec 01 09:29:22 compute-0 systemd-rc-local-generator[220283]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:29:22 compute-0 systemd-sysv-generator[220289]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:29:22 compute-0 podman[220306]: 2025-12-01 09:29:22.384539885 +0000 UTC m=+0.040291516 container create 49240aed0ce8c0a9d41194a86f4013f2ff65c51e6b2c3aad2a481907a32f866a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_swirles, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:29:22 compute-0 podman[220306]: 2025-12-01 09:29:22.365819724 +0000 UTC m=+0.021571375 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:29:22 compute-0 systemd[1]: Started libpod-conmon-49240aed0ce8c0a9d41194a86f4013f2ff65c51e6b2c3aad2a481907a32f866a.scope.
Dec 01 09:29:22 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:29:22 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Dec 01 09:29:22 compute-0 sudo[220129]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:22 compute-0 podman[220306]: 2025-12-01 09:29:22.712655696 +0000 UTC m=+0.368407367 container init 49240aed0ce8c0a9d41194a86f4013f2ff65c51e6b2c3aad2a481907a32f866a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_swirles, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:29:22 compute-0 podman[220306]: 2025-12-01 09:29:22.720670728 +0000 UTC m=+0.376422359 container start 49240aed0ce8c0a9d41194a86f4013f2ff65c51e6b2c3aad2a481907a32f866a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_swirles, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:29:22 compute-0 sad_swirles[220323]: 167 167
Dec 01 09:29:22 compute-0 podman[220306]: 2025-12-01 09:29:22.724731425 +0000 UTC m=+0.380483056 container attach 49240aed0ce8c0a9d41194a86f4013f2ff65c51e6b2c3aad2a481907a32f866a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_swirles, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:29:22 compute-0 systemd[1]: libpod-49240aed0ce8c0a9d41194a86f4013f2ff65c51e6b2c3aad2a481907a32f866a.scope: Deactivated successfully.
Dec 01 09:29:22 compute-0 podman[220306]: 2025-12-01 09:29:22.729614556 +0000 UTC m=+0.385366207 container died 49240aed0ce8c0a9d41194a86f4013f2ff65c51e6b2c3aad2a481907a32f866a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_swirles, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:29:22 compute-0 systemd[1]: var-lib-containers-storage-overlay-ac4c2638152228515c012ceb77ac5fc747b57c8854ef405a1384b4a73a3545ea-merged.mount: Deactivated successfully.
Dec 01 09:29:22 compute-0 podman[220306]: 2025-12-01 09:29:22.845889139 +0000 UTC m=+0.501640770 container remove 49240aed0ce8c0a9d41194a86f4013f2ff65c51e6b2c3aad2a481907a32f866a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_swirles, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, ceph=True, org.label-schema.vendor=CentOS)
Dec 01 09:29:22 compute-0 systemd[1]: libpod-conmon-49240aed0ce8c0a9d41194a86f4013f2ff65c51e6b2c3aad2a481907a32f866a.scope: Deactivated successfully.
Dec 01 09:29:23 compute-0 sudo[220514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prcsvoygpcdqcgxydtmipcbdznlmkrnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581362.7748485-1438-168151068000140/AnsiballZ_systemd.py'
Dec 01 09:29:23 compute-0 sudo[220514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:23 compute-0 podman[220452]: 2025-12-01 09:29:22.977914987 +0000 UTC m=+0.023279724 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:29:23 compute-0 podman[220452]: 2025-12-01 09:29:23.157525202 +0000 UTC m=+0.202889909 container create 497bb4a79b271b72c4cedf3c8ccee7e76652a523f81a468417ad98d4518afa88 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_wozniak, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507)
Dec 01 09:29:23 compute-0 systemd[1]: Started libpod-conmon-497bb4a79b271b72c4cedf3c8ccee7e76652a523f81a468417ad98d4518afa88.scope.
Dec 01 09:29:23 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:29:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f906a6faa141318a3c68c444321de5dd4d57bf1a196498ed6159c5ea461e2f5/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:29:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f906a6faa141318a3c68c444321de5dd4d57bf1a196498ed6159c5ea461e2f5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:29:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f906a6faa141318a3c68c444321de5dd4d57bf1a196498ed6159c5ea461e2f5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:29:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f906a6faa141318a3c68c444321de5dd4d57bf1a196498ed6159c5ea461e2f5/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:29:23 compute-0 podman[220452]: 2025-12-01 09:29:23.263811326 +0000 UTC m=+0.309176063 container init 497bb4a79b271b72c4cedf3c8ccee7e76652a523f81a468417ad98d4518afa88 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_wozniak, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_REF=reef)
Dec 01 09:29:23 compute-0 podman[220452]: 2025-12-01 09:29:23.272487777 +0000 UTC m=+0.317852484 container start 497bb4a79b271b72c4cedf3c8ccee7e76652a523f81a468417ad98d4518afa88 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_wozniak, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Dec 01 09:29:23 compute-0 python3.9[220516]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 01 09:29:23 compute-0 podman[220452]: 2025-12-01 09:29:23.336692594 +0000 UTC m=+0.382057331 container attach 497bb4a79b271b72c4cedf3c8ccee7e76652a523f81a468417ad98d4518afa88 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_wozniak, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:29:23 compute-0 ceph-mon[75031]: pgmap v522: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:23 compute-0 systemd[1]: Reloading.
Dec 01 09:29:23 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v523: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:23 compute-0 systemd-rc-local-generator[220551]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:29:23 compute-0 systemd-sysv-generator[220555]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:29:23 compute-0 systemd[1]: Reloading.
Dec 01 09:29:23 compute-0 systemd-sysv-generator[220588]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:29:23 compute-0 systemd-rc-local-generator[220584]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:29:24 compute-0 sudo[220514]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:24 compute-0 keen_wozniak[220519]: {
Dec 01 09:29:24 compute-0 keen_wozniak[220519]:     "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec 01 09:29:24 compute-0 keen_wozniak[220519]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:29:24 compute-0 keen_wozniak[220519]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 01 09:29:24 compute-0 keen_wozniak[220519]:         "osd_id": 0,
Dec 01 09:29:24 compute-0 keen_wozniak[220519]:         "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:29:24 compute-0 keen_wozniak[220519]:         "type": "bluestore"
Dec 01 09:29:24 compute-0 keen_wozniak[220519]:     },
Dec 01 09:29:24 compute-0 keen_wozniak[220519]:     "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec 01 09:29:24 compute-0 keen_wozniak[220519]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:29:24 compute-0 keen_wozniak[220519]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 01 09:29:24 compute-0 keen_wozniak[220519]:         "osd_id": 1,
Dec 01 09:29:24 compute-0 keen_wozniak[220519]:         "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:29:24 compute-0 keen_wozniak[220519]:         "type": "bluestore"
Dec 01 09:29:24 compute-0 keen_wozniak[220519]:     },
Dec 01 09:29:24 compute-0 keen_wozniak[220519]:     "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec 01 09:29:24 compute-0 keen_wozniak[220519]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:29:24 compute-0 keen_wozniak[220519]:         "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec 01 09:29:24 compute-0 keen_wozniak[220519]:         "osd_id": 2,
Dec 01 09:29:24 compute-0 keen_wozniak[220519]:         "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:29:24 compute-0 keen_wozniak[220519]:         "type": "bluestore"
Dec 01 09:29:24 compute-0 keen_wozniak[220519]:     }
Dec 01 09:29:24 compute-0 keen_wozniak[220519]: }
Dec 01 09:29:24 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:29:24 compute-0 systemd[1]: libpod-497bb4a79b271b72c4cedf3c8ccee7e76652a523f81a468417ad98d4518afa88.scope: Deactivated successfully.
Dec 01 09:29:24 compute-0 systemd[1]: libpod-497bb4a79b271b72c4cedf3c8ccee7e76652a523f81a468417ad98d4518afa88.scope: Consumed 1.057s CPU time.
Dec 01 09:29:24 compute-0 podman[220452]: 2025-12-01 09:29:24.345219285 +0000 UTC m=+1.390583992 container died 497bb4a79b271b72c4cedf3c8ccee7e76652a523f81a468417ad98d4518afa88 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_wozniak, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Dec 01 09:29:24 compute-0 ceph-mon[75031]: pgmap v523: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:24 compute-0 sshd-session[160021]: Connection closed by 192.168.122.30 port 41430
Dec 01 09:29:24 compute-0 sshd-session[160018]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:29:24 compute-0 systemd[1]: session-49.scope: Deactivated successfully.
Dec 01 09:29:24 compute-0 systemd[1]: session-49.scope: Consumed 3min 29.176s CPU time.
Dec 01 09:29:24 compute-0 systemd-logind[788]: Session 49 logged out. Waiting for processes to exit.
Dec 01 09:29:24 compute-0 systemd-logind[788]: Removed session 49.
Dec 01 09:29:24 compute-0 systemd[1]: var-lib-containers-storage-overlay-1f906a6faa141318a3c68c444321de5dd4d57bf1a196498ed6159c5ea461e2f5-merged.mount: Deactivated successfully.
Dec 01 09:29:24 compute-0 podman[220452]: 2025-12-01 09:29:24.687975129 +0000 UTC m=+1.733339846 container remove 497bb4a79b271b72c4cedf3c8ccee7e76652a523f81a468417ad98d4518afa88 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_wozniak, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:29:24 compute-0 systemd[1]: libpod-conmon-497bb4a79b271b72c4cedf3c8ccee7e76652a523f81a468417ad98d4518afa88.scope: Deactivated successfully.
Dec 01 09:29:24 compute-0 sudo[220207]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:24 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:29:24 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:29:24 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:29:24 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:29:24 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 95a0ece0-f8d3-4d5a-9897-fa9ce172ac75 does not exist
Dec 01 09:29:24 compute-0 sudo[220660]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:29:24 compute-0 sudo[220660]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:29:24 compute-0 sudo[220660]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:24 compute-0 sudo[220685]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:29:24 compute-0 sudo[220685]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:29:24 compute-0 sudo[220685]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:24 compute-0 podman[220709]: 2025-12-01 09:29:24.951557443 +0000 UTC m=+0.064936410 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 01 09:29:25 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v524: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:25 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:29:25 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:29:26 compute-0 ceph-mon[75031]: pgmap v524: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:27 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v525: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:28 compute-0 ceph-mon[75031]: pgmap v525: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:29 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:29:29 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v526: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:30 compute-0 sshd-session[220729]: Accepted publickey for zuul from 192.168.122.30 port 54424 ssh2: ECDSA SHA256:qeYLzcMt7u+aIXWvxTim9ZQrhR80x0VMZgLc05vjQmw
Dec 01 09:29:30 compute-0 systemd-logind[788]: New session 50 of user zuul.
Dec 01 09:29:30 compute-0 systemd[1]: Started Session 50 of User zuul.
Dec 01 09:29:30 compute-0 sshd-session[220729]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:29:30 compute-0 ceph-mon[75031]: pgmap v526: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:31 compute-0 python3.9[220882]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:29:31 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v527: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:32 compute-0 python3.9[221036]: ansible-ansible.builtin.service_facts Invoked
Dec 01 09:29:32 compute-0 network[221053]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 01 09:29:32 compute-0 network[221054]: 'network-scripts' will be removed from distribution in near future.
Dec 01 09:29:32 compute-0 network[221055]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 01 09:29:32 compute-0 ceph-mon[75031]: pgmap v527: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:33 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v528: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:34 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:29:34 compute-0 ceph-mon[75031]: pgmap v528: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:35 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v529: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:36 compute-0 sudo[221325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ondxlkbixhcopymwiaazunkdmnuhropm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581376.2256517-47-193817743909380/AnsiballZ_setup.py'
Dec 01 09:29:36 compute-0 sudo[221325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:36 compute-0 python3.9[221327]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 09:29:36 compute-0 ceph-mon[75031]: pgmap v529: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:37 compute-0 sudo[221325]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:37 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v530: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:37 compute-0 sudo[221409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyltezifhajutymakmacfebjxnplxjif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581376.2256517-47-193817743909380/AnsiballZ_dnf.py'
Dec 01 09:29:37 compute-0 sudo[221409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:37 compute-0 python3.9[221411]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:29:39 compute-0 ceph-mon[75031]: pgmap v530: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:39 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:29:39 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v531: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:41 compute-0 ceph-mon[75031]: pgmap v531: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:41 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v532: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:43 compute-0 ceph-mon[75031]: pgmap v532: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:29:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:29:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:29:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:29:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:29:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:29:43 compute-0 sudo[221409]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:43 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v533: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:43 compute-0 sudo[221562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkjwxibiocrndxykdtusecefsmqxlwkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581383.315166-59-261452815296722/AnsiballZ_stat.py'
Dec 01 09:29:43 compute-0 sudo[221562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:43 compute-0 python3.9[221564]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:29:43 compute-0 sudo[221562]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:29:44 compute-0 sudo[221714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqolgwfcbyddulhlxdzwcfdnhlmlrlwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581384.3549225-69-113629361261635/AnsiballZ_command.py'
Dec 01 09:29:44 compute-0 sudo[221714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:45 compute-0 python3.9[221716]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:29:45 compute-0 ceph-mon[75031]: pgmap v533: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:45 compute-0 sudo[221714]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:45 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v534: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:45 compute-0 sudo[221867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etajqhecactnmgzmgbfqvgvoppjxyndz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581385.3181524-79-218435502290170/AnsiballZ_stat.py'
Dec 01 09:29:45 compute-0 sudo[221867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:45 compute-0 python3.9[221869]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:29:45 compute-0 sudo[221867]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:46 compute-0 sudo[222019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnmrpbsjgfojiyhtouowruqbbkfgswuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581385.9776528-87-51477562372085/AnsiballZ_command.py'
Dec 01 09:29:46 compute-0 sudo[222019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:46 compute-0 python3.9[222021]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:29:46 compute-0 sudo[222019]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:46 compute-0 sudo[222172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybjclvvoclmrfuhmvvnwkjiztgehdfya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581386.6061609-95-64919500063399/AnsiballZ_stat.py'
Dec 01 09:29:46 compute-0 sudo[222172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:47 compute-0 python3.9[222174]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:29:47 compute-0 sudo[222172]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:47 compute-0 ceph-mon[75031]: pgmap v534: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:47 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v535: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:47 compute-0 sudo[222295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nckmdobjlwghlpvvkxeahrdpfgmkcyon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581386.6061609-95-64919500063399/AnsiballZ_copy.py'
Dec 01 09:29:47 compute-0 sudo[222295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:47 compute-0 python3.9[222297]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764581386.6061609-95-64919500063399/.source.iscsi _original_basename=.d9ygdvds follow=False checksum=8b34ecf17114dfe93c0af71f0eb5d4d0f9d9b273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:29:47 compute-0 sudo[222295]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:48 compute-0 sudo[222447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ouehvwqigtchqjowizqxypfrfxgkeppz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581388.0201318-110-228818585221726/AnsiballZ_file.py'
Dec 01 09:29:48 compute-0 sudo[222447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:48 compute-0 python3.9[222449]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:29:48 compute-0 sudo[222447]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:49 compute-0 ceph-mon[75031]: pgmap v535: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:49 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:29:49 compute-0 sudo[222599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlpnpbzccxoyfwndhnnhhasmvbaoranb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581388.8890333-118-63498511246058/AnsiballZ_lineinfile.py'
Dec 01 09:29:49 compute-0 sudo[222599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:49 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v536: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:49 compute-0 python3.9[222601]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:29:49 compute-0 sudo[222599]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:50 compute-0 sudo[222764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjybmyupzzmvhawcqryldkgdssoxsqyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581389.8133626-127-208669034625090/AnsiballZ_systemd_service.py'
Dec 01 09:29:50 compute-0 sudo[222764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:50 compute-0 podman[222725]: 2025-12-01 09:29:50.578467478 +0000 UTC m=+0.165997272 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 01 09:29:50 compute-0 python3.9[222771]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:29:50 compute-0 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Dec 01 09:29:50 compute-0 sudo[222764]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:51 compute-0 sudo[222931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtfbvobfvevrahpvfgqtezklprbzidax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581391.009134-135-225054074621531/AnsiballZ_systemd_service.py'
Dec 01 09:29:51 compute-0 sudo[222931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:51 compute-0 ceph-mon[75031]: pgmap v536: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:51 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v537: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:51 compute-0 python3.9[222933]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:29:51 compute-0 systemd[1]: Reloading.
Dec 01 09:29:51 compute-0 systemd-sysv-generator[222966]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:29:51 compute-0 systemd-rc-local-generator[222963]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:29:52 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec 01 09:29:52 compute-0 systemd[1]: Starting Open-iSCSI...
Dec 01 09:29:52 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Dec 01 09:29:52 compute-0 systemd[1]: Started Open-iSCSI.
Dec 01 09:29:52 compute-0 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Dec 01 09:29:52 compute-0 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Dec 01 09:29:52 compute-0 sudo[222931]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:52 compute-0 ceph-mon[75031]: pgmap v537: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:52 compute-0 sudo[223131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqdaklvgeidxisppwtnthlcutdzwdvdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581392.4848788-146-42453443030600/AnsiballZ_service_facts.py'
Dec 01 09:29:52 compute-0 sudo[223131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:52 compute-0 python3.9[223133]: ansible-ansible.builtin.service_facts Invoked
Dec 01 09:29:53 compute-0 network[223150]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 01 09:29:53 compute-0 network[223151]: 'network-scripts' will be removed from distribution in near future.
Dec 01 09:29:53 compute-0 network[223152]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 01 09:29:53 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v538: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:54 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:29:54 compute-0 ceph-mon[75031]: pgmap v538: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:55 compute-0 podman[223206]: 2025-12-01 09:29:55.051230985 +0000 UTC m=+0.055970259 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Dec 01 09:29:55 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v539: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:56 compute-0 sudo[223131]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:56 compute-0 sudo[223440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlbzwgoaobxgswwlhldbvougupwaiiwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581396.572812-156-255903199047341/AnsiballZ_file.py'
Dec 01 09:29:56 compute-0 sudo[223440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:57 compute-0 python3.9[223442]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 01 09:29:57 compute-0 sudo[223440]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:57 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v540: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:57 compute-0 ceph-mon[75031]: pgmap v539: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:58 compute-0 sudo[223592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbraqnkswmtnsktemklmttlhwfmhxzph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581397.7786527-164-77927492098555/AnsiballZ_modprobe.py'
Dec 01 09:29:58 compute-0 sudo[223592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:58 compute-0 python3.9[223594]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Dec 01 09:29:58 compute-0 sudo[223592]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:58 compute-0 sudo[223748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-juhxvsqlytuwwoudrppckgnagksoorgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581398.5807467-172-228270595855104/AnsiballZ_stat.py'
Dec 01 09:29:58 compute-0 sudo[223748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:58 compute-0 ceph-mon[75031]: pgmap v540: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:59 compute-0 python3.9[223750]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:29:59 compute-0 sudo[223748]: pam_unix(sudo:session): session closed for user root
Dec 01 09:29:59 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:29:59 compute-0 sudo[223871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxsaptvgiloclyostglbkeaqsjtpzpjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581398.5807467-172-228270595855104/AnsiballZ_copy.py'
Dec 01 09:29:59 compute-0 sudo[223871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:29:59 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v541: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:29:59 compute-0 python3.9[223873]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764581398.5807467-172-228270595855104/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:29:59 compute-0 sudo[223871]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:00 compute-0 sudo[224023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhkwadyxbzfskyhxjxslpnztshipxbpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581399.8463044-188-168636685490097/AnsiballZ_lineinfile.py'
Dec 01 09:30:00 compute-0 sudo[224023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:00 compute-0 python3.9[224025]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:30:00 compute-0 sudo[224023]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:00 compute-0 ceph-mon[75031]: pgmap v541: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:01 compute-0 sudo[224175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqcnyhnptgkobidrpzffqhbkhqsnsjij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581400.457653-196-188907933961219/AnsiballZ_systemd.py'
Dec 01 09:30:01 compute-0 sudo[224175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:01 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v542: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:01 compute-0 python3.9[224177]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 09:30:01 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 01 09:30:01 compute-0 systemd[1]: Stopped Load Kernel Modules.
Dec 01 09:30:01 compute-0 systemd[1]: Stopping Load Kernel Modules...
Dec 01 09:30:01 compute-0 systemd[1]: Starting Load Kernel Modules...
Dec 01 09:30:01 compute-0 systemd[1]: Finished Load Kernel Modules.
Dec 01 09:30:01 compute-0 sudo[224175]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:02 compute-0 sudo[224331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrkqwrazerwuvkzhudgygpaxakjtrytq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581401.8586476-204-281329809991551/AnsiballZ_file.py'
Dec 01 09:30:02 compute-0 sudo[224331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:02 compute-0 python3.9[224333]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:30:02 compute-0 sudo[224331]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:02 compute-0 sudo[224483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkmliaxvhzguyyeizztdiyigbfneieog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581402.5688303-213-67097250260262/AnsiballZ_stat.py'
Dec 01 09:30:02 compute-0 sudo[224483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:02 compute-0 ceph-mon[75031]: pgmap v542: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:03 compute-0 python3.9[224485]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:30:03 compute-0 sudo[224483]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:03 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v543: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:03 compute-0 sudo[224635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amugnzreyaqsqhhgcxrrrngnsuwhdalm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581403.2492046-222-254729294625946/AnsiballZ_stat.py'
Dec 01 09:30:03 compute-0 sudo[224635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:03 compute-0 python3.9[224637]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:30:03 compute-0 sudo[224635]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:04 compute-0 sudo[224787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skrtceqinnindkjjmijfqbyilwfdcibn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581403.929408-230-256672124374212/AnsiballZ_stat.py'
Dec 01 09:30:04 compute-0 sudo[224787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:04 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:30:04 compute-0 python3.9[224789]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:30:04 compute-0 sudo[224787]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:04 compute-0 sudo[224910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-budoycyahgnmzihkklrtwbxjfluaggni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581403.929408-230-256672124374212/AnsiballZ_copy.py'
Dec 01 09:30:04 compute-0 sudo[224910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:04 compute-0 python3.9[224912]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764581403.929408-230-256672124374212/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:30:04 compute-0 sudo[224910]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:04 compute-0 ceph-mon[75031]: pgmap v543: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:05 compute-0 sudo[225062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmyehwtspjclvbhsdbqedrxgiwtueaay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581405.101181-245-143410100079660/AnsiballZ_command.py'
Dec 01 09:30:05 compute-0 sudo[225062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:05 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v544: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:05 compute-0 python3.9[225064]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:30:05 compute-0 sudo[225062]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:05 compute-0 sudo[225215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfmkavbzxwoqheusekqsrupvrlvkbmhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581405.7101483-253-187722274066300/AnsiballZ_lineinfile.py'
Dec 01 09:30:05 compute-0 sudo[225215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:06 compute-0 python3.9[225217]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:30:06 compute-0 sudo[225215]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:06 compute-0 sudo[225367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgastaghmimprrnubxtfdeduuibhqjov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581406.3646042-261-178558573348323/AnsiballZ_replace.py'
Dec 01 09:30:06 compute-0 sudo[225367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:06 compute-0 ceph-mon[75031]: pgmap v544: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:07 compute-0 python3.9[225369]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:30:07 compute-0 sudo[225367]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:07 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v545: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:07 compute-0 sudo[225519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aflecsrqfpwcikcxeiswuwepkoufgexc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581407.252183-269-96992150291095/AnsiballZ_replace.py'
Dec 01 09:30:07 compute-0 sudo[225519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:07 compute-0 python3.9[225521]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:30:07 compute-0 sudo[225519]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:08 compute-0 sudo[225671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-duyqvsljvfusirxdauzjcrghdufwdyzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581407.9384592-278-121643980928786/AnsiballZ_lineinfile.py'
Dec 01 09:30:08 compute-0 sudo[225671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:08 compute-0 python3.9[225673]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:30:08 compute-0 sudo[225671]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:08 compute-0 sudo[225823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nymtugdugudafmejqgbyabklasrnlpyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581408.5340624-278-61831807755956/AnsiballZ_lineinfile.py'
Dec 01 09:30:08 compute-0 sudo[225823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:08 compute-0 ceph-mon[75031]: pgmap v545: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:08 compute-0 python3.9[225825]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:30:08 compute-0 sudo[225823]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:09 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:30:09 compute-0 sudo[225975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppfiriogemmywunzlqvmbstmanryzhns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581409.1103985-278-92584290022770/AnsiballZ_lineinfile.py'
Dec 01 09:30:09 compute-0 sudo[225975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:09 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v546: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:09 compute-0 python3.9[225977]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:30:09 compute-0 sudo[225975]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:09 compute-0 sudo[226127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mypopfwwnpyosnqdhtqswluwoxhdtovx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581409.6959324-278-77064643246492/AnsiballZ_lineinfile.py'
Dec 01 09:30:09 compute-0 sudo[226127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:10 compute-0 python3.9[226129]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:30:10 compute-0 sudo[226127]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:10 compute-0 sudo[226279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajhxcbikeojhbaaocfrmxymugupdtjmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581410.3336105-307-248349763518014/AnsiballZ_stat.py'
Dec 01 09:30:10 compute-0 sudo[226279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:10 compute-0 python3.9[226281]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:30:10 compute-0 sudo[226279]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:10 compute-0 ceph-mon[75031]: pgmap v546: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:11 compute-0 sudo[226433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzrmtknjjrxaqzzgartgdmcslfpsocss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581411.078994-315-180971957327741/AnsiballZ_file.py'
Dec 01 09:30:11 compute-0 sudo[226433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:11 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v547: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:11 compute-0 python3.9[226435]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:30:11 compute-0 sudo[226433]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:12 compute-0 sudo[226585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-toflzwsdmkwjdlcglumjevrzfrqdlxyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581411.8244307-324-270469581793494/AnsiballZ_file.py'
Dec 01 09:30:12 compute-0 sudo[226585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:12 compute-0 python3.9[226587]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:30:12 compute-0 sudo[226585]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:12 compute-0 sudo[226737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqjqoaxlechrnwdcrxddnidrsvnfdmkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581412.579713-332-26599113439130/AnsiballZ_stat.py'
Dec 01 09:30:12 compute-0 sudo[226737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:12 compute-0 ceph-mon[75031]: pgmap v547: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:30:13
Dec 01 09:30:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:30:13 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:30:13 compute-0 ceph-mgr[75324]: [balancer INFO root] pools ['vms', 'backups', 'images', '.mgr', 'cephfs.cephfs.data', 'volumes', 'cephfs.cephfs.meta']
Dec 01 09:30:13 compute-0 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec 01 09:30:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:30:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:30:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:30:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:30:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:30:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:30:13 compute-0 python3.9[226739]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:30:13 compute-0 sudo[226737]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:30:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:30:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:30:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:30:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:30:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:30:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:30:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:30:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:30:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:30:13 compute-0 sudo[226815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anherkqufynuxjgoomzpnxlypbijuzox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581412.579713-332-26599113439130/AnsiballZ_file.py'
Dec 01 09:30:13 compute-0 sudo[226815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:13 compute-0 python3.9[226817]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:30:13 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v548: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:13 compute-0 sudo[226815]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:14 compute-0 sudo[226967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rymzqrhyoplbkduesqoibfmggmrljgqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581413.6986873-332-149013138447317/AnsiballZ_stat.py'
Dec 01 09:30:14 compute-0 sudo[226967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:14 compute-0 python3.9[226969]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:30:14 compute-0 sudo[226967]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:14 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:30:14 compute-0 sudo[227045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzzlracnuqrvdiwcwzmpovyhfjsvgyvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581413.6986873-332-149013138447317/AnsiballZ_file.py'
Dec 01 09:30:14 compute-0 sudo[227045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:14 compute-0 python3.9[227047]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:30:14 compute-0 sudo[227045]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:14 compute-0 ceph-mon[75031]: pgmap v548: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:15 compute-0 sudo[227197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hycpzefpgmpgffqeidgtzzbimfkkxqyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581414.9101117-355-244218858960529/AnsiballZ_file.py'
Dec 01 09:30:15 compute-0 sudo[227197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:15 compute-0 python3.9[227199]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:30:15 compute-0 sudo[227197]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:15 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v549: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:15 compute-0 sudo[227349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zadflqxgioxxdvbatbubrkjbvlxlhdph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581415.5879858-363-80529139436598/AnsiballZ_stat.py'
Dec 01 09:30:15 compute-0 sudo[227349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:16 compute-0 python3.9[227351]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:30:16 compute-0 sudo[227349]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:16 compute-0 sudo[227427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kslditztccqfphrhdlejqtfvmxevfhej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581415.5879858-363-80529139436598/AnsiballZ_file.py'
Dec 01 09:30:16 compute-0 sudo[227427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:16 compute-0 python3.9[227429]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:30:16 compute-0 sudo[227427]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:16 compute-0 sudo[227579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdkayuiomquaynwghhinkqeqslbwdayv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581416.650865-375-141529912274148/AnsiballZ_stat.py'
Dec 01 09:30:16 compute-0 sudo[227579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:17 compute-0 ceph-mon[75031]: pgmap v549: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:17 compute-0 python3.9[227581]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:30:17 compute-0 sudo[227579]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:17 compute-0 sudo[227657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhuyxblwbtgkftlssucjgwgsnudqqkuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581416.650865-375-141529912274148/AnsiballZ_file.py'
Dec 01 09:30:17 compute-0 sudo[227657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:17 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v550: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:17 compute-0 python3.9[227659]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:30:17 compute-0 sudo[227657]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:18 compute-0 sudo[227809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbufxdhvmyqgkiiokgqlrabqlvynxxvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581417.771321-387-43796870935608/AnsiballZ_systemd.py'
Dec 01 09:30:18 compute-0 sudo[227809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:18 compute-0 python3.9[227811]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:30:18 compute-0 systemd[1]: Reloading.
Dec 01 09:30:18 compute-0 systemd-rc-local-generator[227838]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:30:18 compute-0 systemd-sysv-generator[227842]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:30:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:30:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:30:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:30:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:30:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:30:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:30:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:30:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:30:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:30:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:30:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:30:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:30:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec 01 09:30:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:30:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:30:18 compute-0 sudo[227809]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:19 compute-0 ceph-mon[75031]: pgmap v550: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:19 compute-0 sudo[227998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttvhyvmimanslbbmppiuxfndkzmfyadi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581418.905689-395-7213678734401/AnsiballZ_stat.py'
Dec 01 09:30:19 compute-0 sudo[227998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:19 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:30:19 compute-0 python3.9[228000]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:30:19 compute-0 sudo[227998]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:19 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v551: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:19 compute-0 sudo[228076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlwyoghvlvqzkuimvghuvhvelzgxgieg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581418.905689-395-7213678734401/AnsiballZ_file.py'
Dec 01 09:30:19 compute-0 sudo[228076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:19 compute-0 python3.9[228078]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:30:19 compute-0 sudo[228076]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:20 compute-0 sudo[228228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycquemgnrcjakunnqjkyofmqzqkyljae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581420.1259983-407-53572781737430/AnsiballZ_stat.py'
Dec 01 09:30:20 compute-0 sudo[228228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:30:20.463 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:30:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:30:20.464 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:30:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:30:20.465 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:30:20 compute-0 python3.9[228230]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:30:20 compute-0 sudo[228228]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:20 compute-0 sudo[228312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flpyiltnwrsyvwcdmrqjsesbiybokvzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581420.1259983-407-53572781737430/AnsiballZ_file.py'
Dec 01 09:30:20 compute-0 sudo[228312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:21 compute-0 podman[228280]: 2025-12-01 09:30:21.052734163 +0000 UTC m=+0.157044343 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 01 09:30:21 compute-0 ceph-mon[75031]: pgmap v551: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:21 compute-0 python3.9[228319]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:30:21 compute-0 sudo[228312]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:21 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v552: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:21 compute-0 sudo[228484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfbfpvcywdrlklcuonyuzoswylyzzazq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581421.3324366-419-251194107359502/AnsiballZ_systemd.py'
Dec 01 09:30:21 compute-0 sudo[228484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:22 compute-0 python3.9[228486]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:30:22 compute-0 systemd[1]: Reloading.
Dec 01 09:30:22 compute-0 systemd-rc-local-generator[228511]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:30:22 compute-0 systemd-sysv-generator[228515]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:30:22 compute-0 systemd[1]: Starting Create netns directory...
Dec 01 09:30:22 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 01 09:30:22 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 01 09:30:22 compute-0 systemd[1]: Finished Create netns directory.
Dec 01 09:30:22 compute-0 sudo[228484]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:23 compute-0 sudo[228677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghqtotzukwlfmsfwacelccpdkimcjmaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581422.7556095-429-220509453031177/AnsiballZ_file.py'
Dec 01 09:30:23 compute-0 sudo[228677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:23 compute-0 ceph-mon[75031]: pgmap v552: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:23 compute-0 python3.9[228679]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:30:23 compute-0 sudo[228677]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:23 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v553: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:23 compute-0 sudo[228829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hshapusaaimotbhpljvjsnaelwuydhcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581423.4307067-437-18414245863872/AnsiballZ_stat.py'
Dec 01 09:30:23 compute-0 sudo[228829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:23 compute-0 python3.9[228831]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:30:23 compute-0 sudo[228829]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:24 compute-0 sudo[228952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipyndgndlsitvtdupyjanwlxhyxcpnvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581423.4307067-437-18414245863872/AnsiballZ_copy.py'
Dec 01 09:30:24 compute-0 sudo[228952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:24 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:30:24 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #30. Immutable memtables: 0.
Dec 01 09:30:24 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:30:24.347037) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 09:30:24 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 30
Dec 01 09:30:24 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581424347089, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 992, "num_deletes": 251, "total_data_size": 962242, "memory_usage": 981456, "flush_reason": "Manual Compaction"}
Dec 01 09:30:24 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #31: started
Dec 01 09:30:24 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581424353519, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 31, "file_size": 585179, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 11017, "largest_seqno": 12008, "table_properties": {"data_size": 581381, "index_size": 1514, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 9682, "raw_average_key_size": 19, "raw_value_size": 573173, "raw_average_value_size": 1179, "num_data_blocks": 69, "num_entries": 486, "num_filter_entries": 486, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764581325, "oldest_key_time": 1764581325, "file_creation_time": 1764581424, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 31, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:30:24 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 6511 microseconds, and 2929 cpu microseconds.
Dec 01 09:30:24 compute-0 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 09:30:24 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:30:24.353560) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #31: 585179 bytes OK
Dec 01 09:30:24 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:30:24.353581) [db/memtable_list.cc:519] [default] Level-0 commit table #31 started
Dec 01 09:30:24 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:30:24.354656) [db/memtable_list.cc:722] [default] Level-0 commit table #31: memtable #1 done
Dec 01 09:30:24 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:30:24.354671) EVENT_LOG_v1 {"time_micros": 1764581424354666, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 01 09:30:24 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:30:24.354691) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 01 09:30:24 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 957567, prev total WAL file size 957567, number of live WAL files 2.
Dec 01 09:30:24 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000027.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:30:24 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:30:24.355306) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400323530' seq:72057594037927935, type:22 .. '6D67727374617400353032' seq:0, type:0; will stop at (end)
Dec 01 09:30:24 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 01 09:30:24 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [31(571KB)], [29(5662KB)]
Dec 01 09:30:24 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581424355333, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [31], "files_L6": [29], "score": -1, "input_data_size": 6383370, "oldest_snapshot_seqno": -1}
Dec 01 09:30:24 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #32: 3199 keys, 4658455 bytes, temperature: kUnknown
Dec 01 09:30:24 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581424387892, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 32, "file_size": 4658455, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 4636400, "index_size": 12986, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8005, "raw_key_size": 74157, "raw_average_key_size": 23, "raw_value_size": 4578328, "raw_average_value_size": 1431, "num_data_blocks": 579, "num_entries": 3199, "num_filter_entries": 3199, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580340, "oldest_key_time": 0, "file_creation_time": 1764581424, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:30:24 compute-0 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 09:30:24 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:30:24.388252) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 4658455 bytes
Dec 01 09:30:24 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:30:24.389824) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 195.1 rd, 142.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 5.5 +0.0 blob) out(4.4 +0.0 blob), read-write-amplify(18.9) write-amplify(8.0) OK, records in: 3669, records dropped: 470 output_compression: NoCompression
Dec 01 09:30:24 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:30:24.389862) EVENT_LOG_v1 {"time_micros": 1764581424389843, "job": 12, "event": "compaction_finished", "compaction_time_micros": 32720, "compaction_time_cpu_micros": 11797, "output_level": 6, "num_output_files": 1, "total_output_size": 4658455, "num_input_records": 3669, "num_output_records": 3199, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 09:30:24 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000031.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:30:24 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581424390187, "job": 12, "event": "table_file_deletion", "file_number": 31}
Dec 01 09:30:24 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:30:24 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581424392266, "job": 12, "event": "table_file_deletion", "file_number": 29}
Dec 01 09:30:24 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:30:24.355228) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:30:24 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:30:24.392393) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:30:24 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:30:24.392401) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:30:24 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:30:24.392404) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:30:24 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:30:24.392406) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:30:24 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:30:24.392409) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:30:24 compute-0 python3.9[228954]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764581423.4307067-437-18414245863872/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:30:24 compute-0 sudo[228952]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:24 compute-0 sudo[228999]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:30:24 compute-0 sudo[228999]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:30:24 compute-0 sudo[228999]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:25 compute-0 sudo[229050]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:30:25 compute-0 sudo[229050]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:30:25 compute-0 sudo[229050]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:25 compute-0 sudo[229101]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:30:25 compute-0 sudo[229101]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:30:25 compute-0 sudo[229101]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:25 compute-0 sudo[229153]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Dec 01 09:30:25 compute-0 sudo[229153]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:30:25 compute-0 podman[229132]: 2025-12-01 09:30:25.15547645 +0000 UTC m=+0.072945061 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 01 09:30:25 compute-0 sudo[229223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npujeqmyxwfgtorubuipytmcnnysuqka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581424.9041533-454-107042086755940/AnsiballZ_file.py'
Dec 01 09:30:25 compute-0 sudo[229223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:25 compute-0 ceph-mon[75031]: pgmap v553: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:25 compute-0 python3.9[229225]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:30:25 compute-0 sudo[229223]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:25 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v554: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:25 compute-0 sudo[229153]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:30:25 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:30:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec 01 09:30:25 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:30:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec 01 09:30:25 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:30:25 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 2869960a-b803-437d-b91f-2975596ac662 does not exist
Dec 01 09:30:25 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 94adc5e7-6c60-49e4-9a61-fe86eff64826 does not exist
Dec 01 09:30:25 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 4353b2dc-5aaa-4544-838b-4bd1eff88270 does not exist
Dec 01 09:30:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec 01 09:30:25 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:30:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec 01 09:30:25 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:30:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:30:25 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:30:25 compute-0 sudo[229333]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:30:25 compute-0 sudo[229333]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:30:25 compute-0 sudo[229333]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:25 compute-0 sudo[229381]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:30:25 compute-0 sudo[229381]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:30:25 compute-0 sudo[229381]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:25 compute-0 sudo[229431]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:30:25 compute-0 sudo[229431]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:30:25 compute-0 sudo[229431]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:25 compute-0 sudo[229479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvcauakuskwkqthdqqbfzjiikpzwnubn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581425.5845556-462-171766235447270/AnsiballZ_stat.py'
Dec 01 09:30:25 compute-0 sudo[229479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:25 compute-0 sudo[229484]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --yes --no-systemd
Dec 01 09:30:25 compute-0 sudo[229484]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:30:26 compute-0 python3.9[229483]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:30:26 compute-0 sudo[229479]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:26 compute-0 podman[229591]: 2025-12-01 09:30:26.225003295 +0000 UTC m=+0.046903098 container create 80ae15a06a44e64d9db81baff16af70425448507f28acfd79802f52a232f8e57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_yonath, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.39.3)
Dec 01 09:30:26 compute-0 systemd[1]: Started libpod-conmon-80ae15a06a44e64d9db81baff16af70425448507f28acfd79802f52a232f8e57.scope.
Dec 01 09:30:26 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:30:26 compute-0 podman[229591]: 2025-12-01 09:30:26.197045156 +0000 UTC m=+0.018944989 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:30:26 compute-0 podman[229591]: 2025-12-01 09:30:26.309612792 +0000 UTC m=+0.131512655 container init 80ae15a06a44e64d9db81baff16af70425448507f28acfd79802f52a232f8e57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_yonath, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Dec 01 09:30:26 compute-0 podman[229591]: 2025-12-01 09:30:26.316739978 +0000 UTC m=+0.138639791 container start 80ae15a06a44e64d9db81baff16af70425448507f28acfd79802f52a232f8e57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_yonath, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:30:26 compute-0 podman[229591]: 2025-12-01 09:30:26.320397224 +0000 UTC m=+0.142297027 container attach 80ae15a06a44e64d9db81baff16af70425448507f28acfd79802f52a232f8e57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_yonath, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Dec 01 09:30:26 compute-0 dreamy_yonath[229633]: 167 167
Dec 01 09:30:26 compute-0 systemd[1]: libpod-80ae15a06a44e64d9db81baff16af70425448507f28acfd79802f52a232f8e57.scope: Deactivated successfully.
Dec 01 09:30:26 compute-0 podman[229591]: 2025-12-01 09:30:26.322568207 +0000 UTC m=+0.144468010 container died 80ae15a06a44e64d9db81baff16af70425448507f28acfd79802f52a232f8e57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_yonath, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_REF=reef, OSD_FLAVOR=default)
Dec 01 09:30:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-f789f30a689397fb5ce232d15e68b9c5a22516a7691bbb16f1e7275c2e52d5fe-merged.mount: Deactivated successfully.
Dec 01 09:30:26 compute-0 ceph-mon[75031]: pgmap v554: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:26 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:30:26 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:30:26 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:30:26 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:30:26 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:30:26 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:30:26 compute-0 podman[229591]: 2025-12-01 09:30:26.369267577 +0000 UTC m=+0.191167380 container remove 80ae15a06a44e64d9db81baff16af70425448507f28acfd79802f52a232f8e57 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_yonath, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:30:26 compute-0 systemd[1]: libpod-conmon-80ae15a06a44e64d9db81baff16af70425448507f28acfd79802f52a232f8e57.scope: Deactivated successfully.
Dec 01 09:30:26 compute-0 sudo[229697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcewdbgqauyeqdxdltpkxzriypotqjwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581425.5845556-462-171766235447270/AnsiballZ_copy.py'
Dec 01 09:30:26 compute-0 sudo[229697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:26 compute-0 python3.9[229701]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764581425.5845556-462-171766235447270/.source.json _original_basename=.12dpb14q follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:30:26 compute-0 podman[229708]: 2025-12-01 09:30:26.570902419 +0000 UTC m=+0.037148745 container create 7827752fda338d4baacdb58836cf82f490f32260ca3761afd478160757c6831d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_shockley, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3)
Dec 01 09:30:26 compute-0 sudo[229697]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:26 compute-0 systemd[1]: Started libpod-conmon-7827752fda338d4baacdb58836cf82f490f32260ca3761afd478160757c6831d.scope.
Dec 01 09:30:26 compute-0 podman[229708]: 2025-12-01 09:30:26.555463763 +0000 UTC m=+0.021710119 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:30:26 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:30:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dbb0acbcc2a8ed7c023da20cd142e1a53cbcb348d5bb00a6b5651d33888fe6b5/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:30:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dbb0acbcc2a8ed7c023da20cd142e1a53cbcb348d5bb00a6b5651d33888fe6b5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:30:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dbb0acbcc2a8ed7c023da20cd142e1a53cbcb348d5bb00a6b5651d33888fe6b5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:30:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dbb0acbcc2a8ed7c023da20cd142e1a53cbcb348d5bb00a6b5651d33888fe6b5/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:30:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dbb0acbcc2a8ed7c023da20cd142e1a53cbcb348d5bb00a6b5651d33888fe6b5/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:30:26 compute-0 podman[229708]: 2025-12-01 09:30:26.680500929 +0000 UTC m=+0.146747275 container init 7827752fda338d4baacdb58836cf82f490f32260ca3761afd478160757c6831d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_shockley, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:30:26 compute-0 podman[229708]: 2025-12-01 09:30:26.690443947 +0000 UTC m=+0.156690283 container start 7827752fda338d4baacdb58836cf82f490f32260ca3761afd478160757c6831d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_shockley, ceph=True, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:30:26 compute-0 podman[229708]: 2025-12-01 09:30:26.693724192 +0000 UTC m=+0.159970548 container attach 7827752fda338d4baacdb58836cf82f490f32260ca3761afd478160757c6831d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_shockley, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:30:27 compute-0 sudo[229879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubdxgqqwxzbnffitxpqzpmtqdgpchzie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581426.7684736-477-261121551615567/AnsiballZ_file.py'
Dec 01 09:30:27 compute-0 sudo[229879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:27 compute-0 python3.9[229881]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:30:27 compute-0 sudo[229879]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:27 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v555: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:27 compute-0 sudo[230039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-meuhtimtpdnmolnimldvlkikhqouoejd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581427.463916-485-137576019181063/AnsiballZ_stat.py'
Dec 01 09:30:27 compute-0 sudo[230039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:28 compute-0 sudo[230039]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:28 compute-0 compassionate_shockley[229727]: --> passed data devices: 0 physical, 3 LVM
Dec 01 09:30:28 compute-0 compassionate_shockley[229727]: --> relative data size: 1.0
Dec 01 09:30:28 compute-0 compassionate_shockley[229727]: --> All data devices are unavailable
Dec 01 09:30:28 compute-0 systemd[1]: libpod-7827752fda338d4baacdb58836cf82f490f32260ca3761afd478160757c6831d.scope: Deactivated successfully.
Dec 01 09:30:28 compute-0 systemd[1]: libpod-7827752fda338d4baacdb58836cf82f490f32260ca3761afd478160757c6831d.scope: Consumed 1.122s CPU time.
Dec 01 09:30:28 compute-0 podman[229708]: 2025-12-01 09:30:28.350392338 +0000 UTC m=+1.816638664 container died 7827752fda338d4baacdb58836cf82f490f32260ca3761afd478160757c6831d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_shockley, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:30:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-dbb0acbcc2a8ed7c023da20cd142e1a53cbcb348d5bb00a6b5651d33888fe6b5-merged.mount: Deactivated successfully.
Dec 01 09:30:28 compute-0 podman[229708]: 2025-12-01 09:30:28.407725046 +0000 UTC m=+1.873971382 container remove 7827752fda338d4baacdb58836cf82f490f32260ca3761afd478160757c6831d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_shockley, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec 01 09:30:28 compute-0 systemd[1]: libpod-conmon-7827752fda338d4baacdb58836cf82f490f32260ca3761afd478160757c6831d.scope: Deactivated successfully.
Dec 01 09:30:28 compute-0 sudo[229484]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:28 compute-0 sudo[230146]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:30:28 compute-0 sudo[230146]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:30:28 compute-0 sudo[230146]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:28 compute-0 sudo[230230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-keglanjhwezmzlptkvupurkqrqqvvveb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581427.463916-485-137576019181063/AnsiballZ_copy.py'
Dec 01 09:30:28 compute-0 sudo[230230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:28 compute-0 sudo[230204]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:30:28 compute-0 sudo[230204]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:30:28 compute-0 sudo[230204]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:28 compute-0 sudo[230245]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:30:28 compute-0 sudo[230245]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:30:28 compute-0 sudo[230245]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:28 compute-0 sudo[230270]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- lvm list --format json
Dec 01 09:30:28 compute-0 sudo[230270]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:30:28 compute-0 sudo[230230]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:29 compute-0 ceph-mon[75031]: pgmap v555: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:29 compute-0 podman[230360]: 2025-12-01 09:30:29.046028159 +0000 UTC m=+0.057067882 container create 2a190d098973965ca00414e4b7df55971e5fb5874a326f627cede9ec71236707 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_bell, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec 01 09:30:29 compute-0 systemd[1]: Started libpod-conmon-2a190d098973965ca00414e4b7df55971e5fb5874a326f627cede9ec71236707.scope.
Dec 01 09:30:29 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:30:29 compute-0 podman[230360]: 2025-12-01 09:30:29.027162183 +0000 UTC m=+0.038201936 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:30:29 compute-0 podman[230360]: 2025-12-01 09:30:29.127244328 +0000 UTC m=+0.138284101 container init 2a190d098973965ca00414e4b7df55971e5fb5874a326f627cede9ec71236707 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_bell, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:30:29 compute-0 podman[230360]: 2025-12-01 09:30:29.1356201 +0000 UTC m=+0.146659813 container start 2a190d098973965ca00414e4b7df55971e5fb5874a326f627cede9ec71236707 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_bell, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec 01 09:30:29 compute-0 hungry_bell[230387]: 167 167
Dec 01 09:30:29 compute-0 systemd[1]: libpod-2a190d098973965ca00414e4b7df55971e5fb5874a326f627cede9ec71236707.scope: Deactivated successfully.
Dec 01 09:30:29 compute-0 podman[230360]: 2025-12-01 09:30:29.146925107 +0000 UTC m=+0.157964870 container attach 2a190d098973965ca00414e4b7df55971e5fb5874a326f627cede9ec71236707 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_bell, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec 01 09:30:29 compute-0 podman[230360]: 2025-12-01 09:30:29.147329209 +0000 UTC m=+0.158368942 container died 2a190d098973965ca00414e4b7df55971e5fb5874a326f627cede9ec71236707 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_bell, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:30:29 compute-0 systemd[1]: var-lib-containers-storage-overlay-b363f101e2ecbda5ae372ebc4c66af89306a96c15a199b4c818c6951afc16c88-merged.mount: Deactivated successfully.
Dec 01 09:30:29 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:30:29 compute-0 podman[230360]: 2025-12-01 09:30:29.37314326 +0000 UTC m=+0.384183013 container remove 2a190d098973965ca00414e4b7df55971e5fb5874a326f627cede9ec71236707 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_bell, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0)
Dec 01 09:30:29 compute-0 systemd[1]: libpod-conmon-2a190d098973965ca00414e4b7df55971e5fb5874a326f627cede9ec71236707.scope: Deactivated successfully.
Dec 01 09:30:29 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v556: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:29 compute-0 sudo[230530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtxpgmirkeemaqxmqzspkpuooftusyzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581429.0990434-502-26733516516859/AnsiballZ_container_config_data.py'
Dec 01 09:30:29 compute-0 sudo[230530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:29 compute-0 podman[230526]: 2025-12-01 09:30:29.634543491 +0000 UTC m=+0.073267240 container create ef927269052ef7b6eeda906e165e82821c7636e08d1109c9ce7ccdacb9f1d086 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_morse, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:30:29 compute-0 systemd[1]: Started libpod-conmon-ef927269052ef7b6eeda906e165e82821c7636e08d1109c9ce7ccdacb9f1d086.scope.
Dec 01 09:30:29 compute-0 podman[230526]: 2025-12-01 09:30:29.607353494 +0000 UTC m=+0.046077273 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:30:29 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:30:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b422c0fe7471334c588dc443263eb3ab6f40a0457d5d81a8f3b0f0989adef1c3/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:30:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b422c0fe7471334c588dc443263eb3ab6f40a0457d5d81a8f3b0f0989adef1c3/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:30:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b422c0fe7471334c588dc443263eb3ab6f40a0457d5d81a8f3b0f0989adef1c3/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:30:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b422c0fe7471334c588dc443263eb3ab6f40a0457d5d81a8f3b0f0989adef1c3/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:30:29 compute-0 podman[230526]: 2025-12-01 09:30:29.730740803 +0000 UTC m=+0.169464532 container init ef927269052ef7b6eeda906e165e82821c7636e08d1109c9ce7ccdacb9f1d086 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_morse, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:30:29 compute-0 podman[230526]: 2025-12-01 09:30:29.739909898 +0000 UTC m=+0.178633617 container start ef927269052ef7b6eeda906e165e82821c7636e08d1109c9ce7ccdacb9f1d086 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_morse, ceph=True, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:30:29 compute-0 podman[230526]: 2025-12-01 09:30:29.743011838 +0000 UTC m=+0.181735577 container attach ef927269052ef7b6eeda906e165e82821c7636e08d1109c9ce7ccdacb9f1d086 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_morse, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.39.3)
Dec 01 09:30:29 compute-0 python3.9[230541]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Dec 01 09:30:29 compute-0 sudo[230530]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:30 compute-0 priceless_morse[230548]: {
Dec 01 09:30:30 compute-0 priceless_morse[230548]:     "0": [
Dec 01 09:30:30 compute-0 priceless_morse[230548]:         {
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             "devices": [
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "/dev/loop3"
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             ],
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             "lv_name": "ceph_lv0",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             "lv_size": "21470642176",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             "name": "ceph_lv0",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             "tags": {
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.cluster_name": "ceph",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.crush_device_class": "",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.encrypted": "0",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.osd_id": "0",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.type": "block",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.vdo": "0"
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             },
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             "type": "block",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             "vg_name": "ceph_vg0"
Dec 01 09:30:30 compute-0 priceless_morse[230548]:         }
Dec 01 09:30:30 compute-0 priceless_morse[230548]:     ],
Dec 01 09:30:30 compute-0 priceless_morse[230548]:     "1": [
Dec 01 09:30:30 compute-0 priceless_morse[230548]:         {
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             "devices": [
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "/dev/loop4"
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             ],
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             "lv_name": "ceph_lv1",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             "lv_size": "21470642176",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             "name": "ceph_lv1",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             "tags": {
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.cluster_name": "ceph",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.crush_device_class": "",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.encrypted": "0",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.osd_id": "1",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.type": "block",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.vdo": "0"
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             },
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             "type": "block",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             "vg_name": "ceph_vg1"
Dec 01 09:30:30 compute-0 priceless_morse[230548]:         }
Dec 01 09:30:30 compute-0 priceless_morse[230548]:     ],
Dec 01 09:30:30 compute-0 priceless_morse[230548]:     "2": [
Dec 01 09:30:30 compute-0 priceless_morse[230548]:         {
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             "devices": [
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "/dev/loop5"
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             ],
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             "lv_name": "ceph_lv2",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             "lv_size": "21470642176",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             "name": "ceph_lv2",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             "tags": {
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.cluster_name": "ceph",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.crush_device_class": "",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.encrypted": "0",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.osd_id": "2",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.type": "block",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:                 "ceph.vdo": "0"
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             },
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             "type": "block",
Dec 01 09:30:30 compute-0 priceless_morse[230548]:             "vg_name": "ceph_vg2"
Dec 01 09:30:30 compute-0 priceless_morse[230548]:         }
Dec 01 09:30:30 compute-0 priceless_morse[230548]:     ]
Dec 01 09:30:30 compute-0 priceless_morse[230548]: }
Dec 01 09:30:30 compute-0 sudo[230706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvcrviikyqouxoraeupzosanwocqwseb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581430.0478854-511-169983609959701/AnsiballZ_container_config_hash.py'
Dec 01 09:30:30 compute-0 sudo[230706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:30 compute-0 systemd[1]: libpod-ef927269052ef7b6eeda906e165e82821c7636e08d1109c9ce7ccdacb9f1d086.scope: Deactivated successfully.
Dec 01 09:30:30 compute-0 podman[230526]: 2025-12-01 09:30:30.530050732 +0000 UTC m=+0.968774461 container died ef927269052ef7b6eeda906e165e82821c7636e08d1109c9ce7ccdacb9f1d086 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_morse, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:30:30 compute-0 systemd[1]: var-lib-containers-storage-overlay-b422c0fe7471334c588dc443263eb3ab6f40a0457d5d81a8f3b0f0989adef1c3-merged.mount: Deactivated successfully.
Dec 01 09:30:30 compute-0 podman[230526]: 2025-12-01 09:30:30.588249056 +0000 UTC m=+1.026972795 container remove ef927269052ef7b6eeda906e165e82821c7636e08d1109c9ce7ccdacb9f1d086 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_morse, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250507, OSD_FLAVOR=default)
Dec 01 09:30:30 compute-0 systemd[1]: libpod-conmon-ef927269052ef7b6eeda906e165e82821c7636e08d1109c9ce7ccdacb9f1d086.scope: Deactivated successfully.
Dec 01 09:30:30 compute-0 sudo[230270]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:30 compute-0 sudo[230721]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:30:30 compute-0 sudo[230721]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:30:30 compute-0 sudo[230721]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:30 compute-0 python3.9[230708]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 01 09:30:30 compute-0 sudo[230706]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:30 compute-0 sudo[230746]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:30:30 compute-0 sudo[230746]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:30:30 compute-0 sudo[230746]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:30 compute-0 sudo[230771]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:30:30 compute-0 sudo[230771]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:30:30 compute-0 sudo[230771]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:30 compute-0 sudo[230820]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- raw list --format json
Dec 01 09:30:30 compute-0 sudo[230820]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:30:31 compute-0 ceph-mon[75031]: pgmap v556: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:31 compute-0 podman[230937]: 2025-12-01 09:30:31.225816186 +0000 UTC m=+0.045984231 container create 74a3d8cc49dab79424a2e8b32053107431014b3bfdbc9c1fdf91353ae2612d5f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_franklin, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507)
Dec 01 09:30:31 compute-0 systemd[1]: Started libpod-conmon-74a3d8cc49dab79424a2e8b32053107431014b3bfdbc9c1fdf91353ae2612d5f.scope.
Dec 01 09:30:31 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:30:31 compute-0 podman[230937]: 2025-12-01 09:30:31.202175922 +0000 UTC m=+0.022343997 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:30:31 compute-0 podman[230937]: 2025-12-01 09:30:31.307815677 +0000 UTC m=+0.127983712 container init 74a3d8cc49dab79424a2e8b32053107431014b3bfdbc9c1fdf91353ae2612d5f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_franklin, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, ceph=True)
Dec 01 09:30:31 compute-0 podman[230937]: 2025-12-01 09:30:31.314926763 +0000 UTC m=+0.135094768 container start 74a3d8cc49dab79424a2e8b32053107431014b3bfdbc9c1fdf91353ae2612d5f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_franklin, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, ceph=True)
Dec 01 09:30:31 compute-0 podman[230937]: 2025-12-01 09:30:31.31791989 +0000 UTC m=+0.138087925 container attach 74a3d8cc49dab79424a2e8b32053107431014b3bfdbc9c1fdf91353ae2612d5f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_franklin, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:30:31 compute-0 great_franklin[230974]: 167 167
Dec 01 09:30:31 compute-0 systemd[1]: libpod-74a3d8cc49dab79424a2e8b32053107431014b3bfdbc9c1fdf91353ae2612d5f.scope: Deactivated successfully.
Dec 01 09:30:31 compute-0 podman[230937]: 2025-12-01 09:30:31.32310851 +0000 UTC m=+0.143276525 container died 74a3d8cc49dab79424a2e8b32053107431014b3bfdbc9c1fdf91353ae2612d5f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_franklin, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec 01 09:30:31 compute-0 systemd[1]: var-lib-containers-storage-overlay-5becdfe9ef8c9309a117519f59f92ba06f6a99d6f64f24bac130d3fc2cf7d6d6-merged.mount: Deactivated successfully.
Dec 01 09:30:31 compute-0 podman[230937]: 2025-12-01 09:30:31.37047857 +0000 UTC m=+0.190646585 container remove 74a3d8cc49dab79424a2e8b32053107431014b3bfdbc9c1fdf91353ae2612d5f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_franklin, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec 01 09:30:31 compute-0 systemd[1]: libpod-conmon-74a3d8cc49dab79424a2e8b32053107431014b3bfdbc9c1fdf91353ae2612d5f.scope: Deactivated successfully.
Dec 01 09:30:31 compute-0 sudo[231045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stqngcjxrdfzmjzgwgskumilpounkyvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581431.003447-520-90473805275076/AnsiballZ_podman_container_info.py'
Dec 01 09:30:31 compute-0 sudo[231045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:31 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v557: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:31 compute-0 podman[231053]: 2025-12-01 09:30:31.568370354 +0000 UTC m=+0.046301971 container create f0f4483ebfdf86af206a9c63735676e41c132240a6f8454ab220b77d9339a559 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_shirley, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507)
Dec 01 09:30:31 compute-0 systemd[1]: Started libpod-conmon-f0f4483ebfdf86af206a9c63735676e41c132240a6f8454ab220b77d9339a559.scope.
Dec 01 09:30:31 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:30:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eaeca87c70200b39d0b478696a395d2839d0740104be83dd09f8e55f0538b77d/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:30:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eaeca87c70200b39d0b478696a395d2839d0740104be83dd09f8e55f0538b77d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:30:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eaeca87c70200b39d0b478696a395d2839d0740104be83dd09f8e55f0538b77d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:30:31 compute-0 podman[231053]: 2025-12-01 09:30:31.54958751 +0000 UTC m=+0.027519147 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:30:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eaeca87c70200b39d0b478696a395d2839d0740104be83dd09f8e55f0538b77d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:30:31 compute-0 podman[231053]: 2025-12-01 09:30:31.657982516 +0000 UTC m=+0.135914163 container init f0f4483ebfdf86af206a9c63735676e41c132240a6f8454ab220b77d9339a559 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_shirley, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:30:31 compute-0 podman[231053]: 2025-12-01 09:30:31.666268395 +0000 UTC m=+0.144200012 container start f0f4483ebfdf86af206a9c63735676e41c132240a6f8454ab220b77d9339a559 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_shirley, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec 01 09:30:31 compute-0 podman[231053]: 2025-12-01 09:30:31.669819348 +0000 UTC m=+0.147750985 container attach f0f4483ebfdf86af206a9c63735676e41c132240a6f8454ab220b77d9339a559 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_shirley, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:30:31 compute-0 python3.9[231048]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 01 09:30:31 compute-0 sudo[231045]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:32 compute-0 sleepy_shirley[231070]: {
Dec 01 09:30:32 compute-0 sleepy_shirley[231070]:     "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec 01 09:30:32 compute-0 sleepy_shirley[231070]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:30:32 compute-0 sleepy_shirley[231070]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 01 09:30:32 compute-0 sleepy_shirley[231070]:         "osd_id": 0,
Dec 01 09:30:32 compute-0 sleepy_shirley[231070]:         "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:30:32 compute-0 sleepy_shirley[231070]:         "type": "bluestore"
Dec 01 09:30:32 compute-0 sleepy_shirley[231070]:     },
Dec 01 09:30:32 compute-0 sleepy_shirley[231070]:     "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec 01 09:30:32 compute-0 sleepy_shirley[231070]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:30:32 compute-0 sleepy_shirley[231070]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 01 09:30:32 compute-0 sleepy_shirley[231070]:         "osd_id": 1,
Dec 01 09:30:32 compute-0 sleepy_shirley[231070]:         "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:30:32 compute-0 sleepy_shirley[231070]:         "type": "bluestore"
Dec 01 09:30:32 compute-0 sleepy_shirley[231070]:     },
Dec 01 09:30:32 compute-0 sleepy_shirley[231070]:     "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec 01 09:30:32 compute-0 sleepy_shirley[231070]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:30:32 compute-0 sleepy_shirley[231070]:         "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec 01 09:30:32 compute-0 sleepy_shirley[231070]:         "osd_id": 2,
Dec 01 09:30:32 compute-0 sleepy_shirley[231070]:         "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:30:32 compute-0 sleepy_shirley[231070]:         "type": "bluestore"
Dec 01 09:30:32 compute-0 sleepy_shirley[231070]:     }
Dec 01 09:30:32 compute-0 sleepy_shirley[231070]: }
Dec 01 09:30:32 compute-0 systemd[1]: libpod-f0f4483ebfdf86af206a9c63735676e41c132240a6f8454ab220b77d9339a559.scope: Deactivated successfully.
Dec 01 09:30:32 compute-0 systemd[1]: libpod-f0f4483ebfdf86af206a9c63735676e41c132240a6f8454ab220b77d9339a559.scope: Consumed 1.090s CPU time.
Dec 01 09:30:32 compute-0 podman[231053]: 2025-12-01 09:30:32.754496571 +0000 UTC m=+1.232428218 container died f0f4483ebfdf86af206a9c63735676e41c132240a6f8454ab220b77d9339a559 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_shirley, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Dec 01 09:30:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-eaeca87c70200b39d0b478696a395d2839d0740104be83dd09f8e55f0538b77d-merged.mount: Deactivated successfully.
Dec 01 09:30:32 compute-0 podman[231053]: 2025-12-01 09:30:32.819885342 +0000 UTC m=+1.297816959 container remove f0f4483ebfdf86af206a9c63735676e41c132240a6f8454ab220b77d9339a559 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_shirley, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:30:32 compute-0 systemd[1]: libpod-conmon-f0f4483ebfdf86af206a9c63735676e41c132240a6f8454ab220b77d9339a559.scope: Deactivated successfully.
Dec 01 09:30:32 compute-0 sudo[230820]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:32 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:30:32 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:30:32 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:30:32 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:30:32 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 19ebf54b-18f2-4d6a-9403-9db0675f9a39 does not exist
Dec 01 09:30:32 compute-0 sudo[231239]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:30:32 compute-0 sudo[231239]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:30:32 compute-0 sudo[231239]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:33 compute-0 sudo[231335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmyrhnhxrygzfpzwlelcksldmcdusvns ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764581432.5607762-533-239169821776584/AnsiballZ_edpm_container_manage.py'
Dec 01 09:30:33 compute-0 sudo[231335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:33 compute-0 sudo[231291]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:30:33 compute-0 sudo[231291]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:30:33 compute-0 ceph-mon[75031]: pgmap v557: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:33 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:30:33 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:30:33 compute-0 sudo[231291]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:33 compute-0 python3[231339]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 01 09:30:33 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v558: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:34 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:30:34 compute-0 podman[231354]: 2025-12-01 09:30:34.690601679 +0000 UTC m=+1.318458484 image pull 9af6aa52ee187025bc25565b66d3eefb486acac26f9281e33f4cce76a40d21f7 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 01 09:30:34 compute-0 podman[231414]: 2025-12-01 09:30:34.82753053 +0000 UTC m=+0.030451972 image pull 9af6aa52ee187025bc25565b66d3eefb486acac26f9281e33f4cce76a40d21f7 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 01 09:30:35 compute-0 podman[231414]: 2025-12-01 09:30:35.056086411 +0000 UTC m=+0.259007793 container create 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd)
Dec 01 09:30:35 compute-0 python3[231339]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume 
/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 01 09:30:35 compute-0 ceph-mon[75031]: pgmap v558: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:35 compute-0 sudo[231335]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:35 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v559: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:35 compute-0 sudo[231601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qleqisopotrbgcwztnqnpmqagxhiaekd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581435.3898146-541-215509555544377/AnsiballZ_stat.py'
Dec 01 09:30:35 compute-0 sudo[231601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:35 compute-0 python3.9[231603]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:30:35 compute-0 sudo[231601]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:36 compute-0 sudo[231755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-angnkiagebrrytcvbtdynfmgcgzpfjra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581436.2036831-550-72853860141922/AnsiballZ_file.py'
Dec 01 09:30:36 compute-0 sudo[231755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:36 compute-0 python3.9[231757]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:30:36 compute-0 sudo[231755]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:36 compute-0 sudo[231831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fimzdcpowwthnkdshfmpdygganzhhuts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581436.2036831-550-72853860141922/AnsiballZ_stat.py'
Dec 01 09:30:36 compute-0 sudo[231831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:37 compute-0 python3.9[231833]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:30:37 compute-0 sudo[231831]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:37 compute-0 ceph-mon[75031]: pgmap v559: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:37 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v560: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:37 compute-0 sudo[231982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ainszrgahihfyzfevmfmhtanwafbkdgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581437.1419468-550-125027055710573/AnsiballZ_copy.py'
Dec 01 09:30:37 compute-0 sudo[231982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:37 compute-0 python3.9[231984]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764581437.1419468-550-125027055710573/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:30:37 compute-0 sudo[231982]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:38 compute-0 sudo[232058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgxktylnhltlehbprxslfcisvginrxvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581437.1419468-550-125027055710573/AnsiballZ_systemd.py'
Dec 01 09:30:38 compute-0 sudo[232058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:38 compute-0 python3.9[232060]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 01 09:30:38 compute-0 systemd[1]: Reloading.
Dec 01 09:30:38 compute-0 systemd-rc-local-generator[232082]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:30:38 compute-0 systemd-sysv-generator[232089]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:30:38 compute-0 sudo[232058]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:39 compute-0 sudo[232169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-przcmgwjlnahjckbnuyyrgwuebmsuzpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581437.1419468-550-125027055710573/AnsiballZ_systemd.py'
Dec 01 09:30:39 compute-0 sudo[232169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:39 compute-0 ceph-mon[75031]: pgmap v560: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:39 compute-0 python3.9[232171]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:30:39 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:30:39 compute-0 systemd[1]: Reloading.
Dec 01 09:30:39 compute-0 systemd-sysv-generator[232204]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:30:39 compute-0 systemd-rc-local-generator[232199]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:30:39 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v561: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:39 compute-0 systemd[1]: Starting multipathd container...
Dec 01 09:30:39 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:30:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50a1c402b0115b7d9031e9416a904b5d23245fa1d19d3ce51f831e7ed4fa7b81/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 01 09:30:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50a1c402b0115b7d9031e9416a904b5d23245fa1d19d3ce51f831e7ed4fa7b81/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 01 09:30:39 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d.
Dec 01 09:30:39 compute-0 podman[232210]: 2025-12-01 09:30:39.873874049 +0000 UTC m=+0.130296100 container init 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3)
Dec 01 09:30:39 compute-0 multipathd[232225]: + sudo -E kolla_set_configs
Dec 01 09:30:39 compute-0 podman[232210]: 2025-12-01 09:30:39.911408854 +0000 UTC m=+0.167830885 container start 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0)
Dec 01 09:30:39 compute-0 sudo[232232]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 01 09:30:39 compute-0 podman[232210]: multipathd
Dec 01 09:30:39 compute-0 sudo[232232]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 01 09:30:39 compute-0 sudo[232232]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 01 09:30:39 compute-0 systemd[1]: Started multipathd container.
Dec 01 09:30:39 compute-0 sudo[232169]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:39 compute-0 multipathd[232225]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 01 09:30:39 compute-0 multipathd[232225]: INFO:__main__:Validating config file
Dec 01 09:30:39 compute-0 multipathd[232225]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 01 09:30:39 compute-0 multipathd[232225]: INFO:__main__:Writing out command to execute
Dec 01 09:30:39 compute-0 sudo[232232]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:39 compute-0 multipathd[232225]: ++ cat /run_command
Dec 01 09:30:39 compute-0 multipathd[232225]: + CMD='/usr/sbin/multipathd -d'
Dec 01 09:30:39 compute-0 multipathd[232225]: + ARGS=
Dec 01 09:30:39 compute-0 multipathd[232225]: + sudo kolla_copy_cacerts
Dec 01 09:30:39 compute-0 sudo[232255]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 01 09:30:39 compute-0 sudo[232255]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 01 09:30:39 compute-0 sudo[232255]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 01 09:30:39 compute-0 podman[232231]: 2025-12-01 09:30:39.992912642 +0000 UTC m=+0.072168259 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 01 09:30:39 compute-0 sudo[232255]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:39 compute-0 multipathd[232225]: Running command: '/usr/sbin/multipathd -d'
Dec 01 09:30:39 compute-0 multipathd[232225]: + [[ ! -n '' ]]
Dec 01 09:30:39 compute-0 multipathd[232225]: + . kolla_extend_start
Dec 01 09:30:39 compute-0 multipathd[232225]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec 01 09:30:39 compute-0 multipathd[232225]: + umask 0022
Dec 01 09:30:39 compute-0 multipathd[232225]: + exec /usr/sbin/multipathd -d
Dec 01 09:30:40 compute-0 systemd[1]: 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d-60c59a972f790ebc.service: Main process exited, code=exited, status=1/FAILURE
Dec 01 09:30:40 compute-0 systemd[1]: 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d-60c59a972f790ebc.service: Failed with result 'exit-code'.
Dec 01 09:30:40 compute-0 multipathd[232225]: 3284.985264 | --------start up--------
Dec 01 09:30:40 compute-0 multipathd[232225]: 3284.985287 | read /etc/multipath.conf
Dec 01 09:30:40 compute-0 multipathd[232225]: 3284.992063 | path checkers start up
Dec 01 09:30:40 compute-0 python3.9[232414]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:30:41 compute-0 ceph-mon[75031]: pgmap v561: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:41 compute-0 sudo[232566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-japycqikvzebeukmuepcjiaiztdmlome ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581440.8039825-586-183217516699396/AnsiballZ_command.py'
Dec 01 09:30:41 compute-0 sudo[232566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:41 compute-0 python3.9[232568]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:30:41 compute-0 sudo[232566]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:41 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v562: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:41 compute-0 sudo[232731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvmsqyhumllgobcgjyubytwihpyopfyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581441.6391766-594-126233978245115/AnsiballZ_systemd.py'
Dec 01 09:30:41 compute-0 sudo[232731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:42 compute-0 python3.9[232733]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 09:30:42 compute-0 systemd[1]: Stopping multipathd container...
Dec 01 09:30:42 compute-0 multipathd[232225]: 3287.407905 | exit (signal)
Dec 01 09:30:42 compute-0 multipathd[232225]: 3287.407990 | --------shut down-------
Dec 01 09:30:42 compute-0 systemd[1]: libpod-832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d.scope: Deactivated successfully.
Dec 01 09:30:42 compute-0 podman[232737]: 2025-12-01 09:30:42.469557165 +0000 UTC m=+0.074051352 container died 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 01 09:30:42 compute-0 systemd[1]: 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d-60c59a972f790ebc.timer: Deactivated successfully.
Dec 01 09:30:42 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d.
Dec 01 09:30:42 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d-userdata-shm.mount: Deactivated successfully.
Dec 01 09:30:42 compute-0 systemd[1]: var-lib-containers-storage-overlay-50a1c402b0115b7d9031e9416a904b5d23245fa1d19d3ce51f831e7ed4fa7b81-merged.mount: Deactivated successfully.
Dec 01 09:30:42 compute-0 podman[232737]: 2025-12-01 09:30:42.650628616 +0000 UTC m=+0.255122823 container cleanup 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd)
Dec 01 09:30:42 compute-0 podman[232737]: multipathd
Dec 01 09:30:42 compute-0 podman[232764]: multipathd
Dec 01 09:30:42 compute-0 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Dec 01 09:30:42 compute-0 systemd[1]: Stopped multipathd container.
Dec 01 09:30:42 compute-0 systemd[1]: Starting multipathd container...
Dec 01 09:30:42 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:30:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50a1c402b0115b7d9031e9416a904b5d23245fa1d19d3ce51f831e7ed4fa7b81/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 01 09:30:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50a1c402b0115b7d9031e9416a904b5d23245fa1d19d3ce51f831e7ed4fa7b81/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 01 09:30:42 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d.
Dec 01 09:30:42 compute-0 podman[232777]: 2025-12-01 09:30:42.897658654 +0000 UTC m=+0.142157372 container init 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 01 09:30:42 compute-0 multipathd[232792]: + sudo -E kolla_set_configs
Dec 01 09:30:42 compute-0 sudo[232798]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 01 09:30:42 compute-0 sudo[232798]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 01 09:30:42 compute-0 sudo[232798]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 01 09:30:42 compute-0 podman[232777]: 2025-12-01 09:30:42.937897422 +0000 UTC m=+0.182396070 container start 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 01 09:30:42 compute-0 podman[232777]: multipathd
Dec 01 09:30:42 compute-0 systemd[1]: Started multipathd container.
Dec 01 09:30:42 compute-0 sudo[232731]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:42 compute-0 multipathd[232792]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 01 09:30:42 compute-0 multipathd[232792]: INFO:__main__:Validating config file
Dec 01 09:30:42 compute-0 multipathd[232792]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 01 09:30:42 compute-0 multipathd[232792]: INFO:__main__:Writing out command to execute
Dec 01 09:30:43 compute-0 sudo[232798]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:43 compute-0 multipathd[232792]: ++ cat /run_command
Dec 01 09:30:43 compute-0 multipathd[232792]: + CMD='/usr/sbin/multipathd -d'
Dec 01 09:30:43 compute-0 multipathd[232792]: + ARGS=
Dec 01 09:30:43 compute-0 multipathd[232792]: + sudo kolla_copy_cacerts
Dec 01 09:30:43 compute-0 sudo[232817]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 01 09:30:43 compute-0 sudo[232817]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 01 09:30:43 compute-0 sudo[232817]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 01 09:30:43 compute-0 sudo[232817]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:43 compute-0 multipathd[232792]: + [[ ! -n '' ]]
Dec 01 09:30:43 compute-0 multipathd[232792]: + . kolla_extend_start
Dec 01 09:30:43 compute-0 multipathd[232792]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec 01 09:30:43 compute-0 multipathd[232792]: Running command: '/usr/sbin/multipathd -d'
Dec 01 09:30:43 compute-0 multipathd[232792]: + umask 0022
Dec 01 09:30:43 compute-0 multipathd[232792]: + exec /usr/sbin/multipathd -d
Dec 01 09:30:43 compute-0 podman[232799]: 2025-12-01 09:30:43.044195821 +0000 UTC m=+0.089674462 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 01 09:30:43 compute-0 multipathd[232792]: 3288.014249 | --------start up--------
Dec 01 09:30:43 compute-0 multipathd[232792]: 3288.014413 | read /etc/multipath.conf
Dec 01 09:30:43 compute-0 multipathd[232792]: 3288.020230 | path checkers start up
Dec 01 09:30:43 compute-0 systemd[1]: 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d-64c3927ea1fd61c3.service: Main process exited, code=exited, status=1/FAILURE
Dec 01 09:30:43 compute-0 systemd[1]: 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d-64c3927ea1fd61c3.service: Failed with result 'exit-code'.
Dec 01 09:30:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:30:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:30:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:30:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:30:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:30:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:30:43 compute-0 ceph-mon[75031]: pgmap v562: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:43 compute-0 sudo[232982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcxdzlqyeusorbclbbvblhfrseefbqls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581443.202365-602-30746877397698/AnsiballZ_file.py'
Dec 01 09:30:43 compute-0 sudo[232982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:43 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v563: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:43 compute-0 python3.9[232984]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:30:43 compute-0 sudo[232982]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:44 compute-0 sudo[233134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qukwdexpkozbkeonlrvwkcecttdsybec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581444.0647876-614-142436866258190/AnsiballZ_file.py'
Dec 01 09:30:44 compute-0 sudo[233134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:30:44 compute-0 python3.9[233136]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 01 09:30:44 compute-0 sudo[233134]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:45 compute-0 sudo[233286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agtucfvxkoeemolypjvzrtoxpbayweox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581444.8211424-622-273251050225263/AnsiballZ_modprobe.py'
Dec 01 09:30:45 compute-0 sudo[233286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:45 compute-0 ceph-mon[75031]: pgmap v563: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:45 compute-0 python3.9[233288]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Dec 01 09:30:45 compute-0 kernel: Key type psk registered
Dec 01 09:30:45 compute-0 sudo[233286]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:45 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v564: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:45 compute-0 sudo[233449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cklinvggiswzbcnhtbwyheidypnurtbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581445.567507-630-248121160340352/AnsiballZ_stat.py'
Dec 01 09:30:45 compute-0 sudo[233449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:46 compute-0 python3.9[233451]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:30:46 compute-0 sudo[233449]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:46 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Dec 01 09:30:46 compute-0 sudo[233573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsyjawtnexjaczotbvhteuyrulvmguyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581445.567507-630-248121160340352/AnsiballZ_copy.py'
Dec 01 09:30:46 compute-0 sudo[233573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:46 compute-0 python3.9[233575]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764581445.567507-630-248121160340352/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:30:46 compute-0 sudo[233573]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:47 compute-0 ceph-mon[75031]: pgmap v564: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:47 compute-0 sudo[233725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnnguzoaeugclssuyayvbduxtygjsrtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581446.9179587-646-69932773515823/AnsiballZ_lineinfile.py'
Dec 01 09:30:47 compute-0 sudo[233725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:47 compute-0 python3.9[233727]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:30:47 compute-0 sudo[233725]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:47 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v565: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:47 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec 01 09:30:47 compute-0 sudo[233878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayqswjdvevzurfqazfjogkyiqurjvxxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581447.6773636-654-246378866638362/AnsiballZ_systemd.py'
Dec 01 09:30:47 compute-0 sudo[233878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:48 compute-0 python3.9[233880]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 09:30:48 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 01 09:30:48 compute-0 systemd[1]: Stopped Load Kernel Modules.
Dec 01 09:30:48 compute-0 systemd[1]: Stopping Load Kernel Modules...
Dec 01 09:30:48 compute-0 systemd[1]: Starting Load Kernel Modules...
Dec 01 09:30:48 compute-0 systemd[1]: Finished Load Kernel Modules.
Dec 01 09:30:48 compute-0 sudo[233878]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:48 compute-0 sudo[234034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwqffscsdrlndmjfevsqqgolxdfwsgpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581448.6622875-662-108559543417983/AnsiballZ_dnf.py'
Dec 01 09:30:48 compute-0 sudo[234034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:49 compute-0 ceph-mon[75031]: pgmap v565: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:49 compute-0 python3.9[234036]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:30:49 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:30:49 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v566: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:51 compute-0 ceph-mon[75031]: pgmap v566: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:51 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v567: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:51 compute-0 systemd[1]: Reloading.
Dec 01 09:30:51 compute-0 systemd-rc-local-generator[234086]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:30:51 compute-0 systemd-sysv-generator[234096]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:30:51 compute-0 podman[234043]: 2025-12-01 09:30:51.82774614 +0000 UTC m=+0.105981581 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Dec 01 09:30:52 compute-0 systemd[1]: Reloading.
Dec 01 09:30:52 compute-0 systemd-sysv-generator[234133]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:30:52 compute-0 systemd-rc-local-generator[234127]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:30:52 compute-0 systemd-logind[788]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 01 09:30:52 compute-0 systemd-logind[788]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 01 09:30:52 compute-0 lvm[234179]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 01 09:30:52 compute-0 lvm[234179]: VG ceph_vg1 finished
Dec 01 09:30:52 compute-0 lvm[234178]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 01 09:30:52 compute-0 lvm[234178]: VG ceph_vg0 finished
Dec 01 09:30:52 compute-0 lvm[234180]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 01 09:30:52 compute-0 lvm[234180]: VG ceph_vg2 finished
Dec 01 09:30:52 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 01 09:30:52 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 01 09:30:52 compute-0 systemd[1]: Reloading.
Dec 01 09:30:52 compute-0 systemd-rc-local-generator[234235]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:30:52 compute-0 systemd-sysv-generator[234239]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:30:53 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 01 09:30:53 compute-0 ceph-mon[75031]: pgmap v567: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:53 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v568: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:53 compute-0 sudo[234034]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:54 compute-0 sudo[235481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onqlniwwlhbdcdcbhupsfbnjtxfbzbvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581453.8035727-670-48887859205805/AnsiballZ_systemd_service.py'
Dec 01 09:30:54 compute-0 sudo[235481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:54 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 01 09:30:54 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 01 09:30:54 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.744s CPU time.
Dec 01 09:30:54 compute-0 systemd[1]: run-r2f03deedab9b48618c1677bc4cb20a2a.service: Deactivated successfully.
Dec 01 09:30:54 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:30:54 compute-0 python3.9[235500]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 09:30:54 compute-0 iscsid[222973]: iscsid shutting down.
Dec 01 09:30:54 compute-0 systemd[1]: Stopping Open-iSCSI...
Dec 01 09:30:54 compute-0 systemd[1]: iscsid.service: Deactivated successfully.
Dec 01 09:30:54 compute-0 systemd[1]: Stopped Open-iSCSI.
Dec 01 09:30:54 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec 01 09:30:54 compute-0 systemd[1]: Starting Open-iSCSI...
Dec 01 09:30:54 compute-0 systemd[1]: Started Open-iSCSI.
Dec 01 09:30:54 compute-0 sudo[235481]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:55 compute-0 ceph-mon[75031]: pgmap v568: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:55 compute-0 python3.9[235678]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:30:55 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v569: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:55 compute-0 podman[235728]: 2025-12-01 09:30:55.964617118 +0000 UTC m=+0.067122852 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 01 09:30:56 compute-0 sudo[235852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzouteyxkassnnucgnifmaeiaeftypxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581455.883146-688-58598615928081/AnsiballZ_file.py'
Dec 01 09:30:56 compute-0 sudo[235852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:56 compute-0 python3.9[235854]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:30:56 compute-0 sudo[235852]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:57 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Dec 01 09:30:57 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Dec 01 09:30:57 compute-0 ceph-mon[75031]: pgmap v569: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:57 compute-0 sudo[236006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txfgbglnftivkpoiksctjywhvnqftlaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581456.7851133-699-256066168615588/AnsiballZ_systemd_service.py'
Dec 01 09:30:57 compute-0 sudo[236006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:30:57 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v570: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:57 compute-0 python3.9[236008]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 01 09:30:57 compute-0 systemd[1]: Reloading.
Dec 01 09:30:57 compute-0 systemd-rc-local-generator[236031]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:30:57 compute-0 systemd-sysv-generator[236035]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:30:58 compute-0 sudo[236006]: pam_unix(sudo:session): session closed for user root
Dec 01 09:30:58 compute-0 ceph-mon[75031]: pgmap v570: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:30:58 compute-0 python3.9[236193]: ansible-ansible.builtin.service_facts Invoked
Dec 01 09:30:58 compute-0 network[236210]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 01 09:30:58 compute-0 network[236211]: 'network-scripts' will be removed from distribution in near future.
Dec 01 09:30:58 compute-0 network[236212]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 01 09:30:59 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:30:59 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v571: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:00 compute-0 ceph-mon[75031]: pgmap v571: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:01 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v572: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:01 compute-0 anacron[30901]: Job `cron.weekly' started
Dec 01 09:31:01 compute-0 anacron[30901]: Job `cron.weekly' terminated
Dec 01 09:31:02 compute-0 sudo[236487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pegqrqqcdhxyhjxyayziemokdhnaojaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581462.317806-718-49789175170593/AnsiballZ_systemd_service.py'
Dec 01 09:31:02 compute-0 sudo[236487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:02 compute-0 ceph-mon[75031]: pgmap v572: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:02 compute-0 python3.9[236489]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:31:02 compute-0 sudo[236487]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:03 compute-0 sudo[236640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjqxxefhtdjuocftofsdwwgmssdqtsgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581463.0732522-718-42079581301099/AnsiballZ_systemd_service.py'
Dec 01 09:31:03 compute-0 sudo[236640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:03 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v573: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:03 compute-0 python3.9[236642]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:31:03 compute-0 sudo[236640]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:04 compute-0 sudo[236793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-reblhwqzxkncblsethhlvzivsvgaymvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581463.8719642-718-32863268984636/AnsiballZ_systemd_service.py'
Dec 01 09:31:04 compute-0 sudo[236793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:04 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:31:04 compute-0 python3.9[236795]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:31:04 compute-0 sudo[236793]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:04 compute-0 ceph-mon[75031]: pgmap v573: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:04 compute-0 sudo[236946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxdpavhmnqjfneuvzhmehbqtjilmwzwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581464.666466-718-26283867088978/AnsiballZ_systemd_service.py'
Dec 01 09:31:04 compute-0 sudo[236946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:05 compute-0 python3.9[236948]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:31:05 compute-0 sudo[236946]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:05 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v574: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:05 compute-0 sudo[237099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngbtcdhwmgpnvaempmuahgndgnrnnmcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581465.5059505-718-259544180436221/AnsiballZ_systemd_service.py'
Dec 01 09:31:05 compute-0 sudo[237099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:06 compute-0 python3.9[237101]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:31:06 compute-0 sudo[237099]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:06 compute-0 sudo[237252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oijccwzfgrkzoakdsesguqukwcsujlna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581466.3051627-718-226064930304961/AnsiballZ_systemd_service.py'
Dec 01 09:31:06 compute-0 sudo[237252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:06 compute-0 ceph-mon[75031]: pgmap v574: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:06 compute-0 python3.9[237254]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:31:06 compute-0 sudo[237252]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:07 compute-0 sudo[237405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwyaubynessjfutsvkjfbweymolhhphy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581467.0922425-718-252429697001796/AnsiballZ_systemd_service.py'
Dec 01 09:31:07 compute-0 sudo[237405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:07 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v575: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:07 compute-0 python3.9[237407]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:31:07 compute-0 sudo[237405]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:08 compute-0 sudo[237558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stulknekcljlqinntdmokuxwtzukhueu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581467.8892844-718-161076884153677/AnsiballZ_systemd_service.py'
Dec 01 09:31:08 compute-0 sudo[237558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:08 compute-0 python3.9[237560]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:31:08 compute-0 sudo[237558]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:09 compute-0 ceph-mon[75031]: pgmap v575: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:09 compute-0 sudo[237711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohcafyyhoeopfeyojtrtjrcqnxfqtyll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581468.930233-777-191444524863700/AnsiballZ_file.py'
Dec 01 09:31:09 compute-0 sudo[237711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:09 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:31:09 compute-0 python3.9[237713]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:31:09 compute-0 sudo[237711]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:09 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v576: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:09 compute-0 sudo[237863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifizyezjpmueodumlqjaokqnengmkftm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581469.583659-777-2125004608229/AnsiballZ_file.py'
Dec 01 09:31:09 compute-0 sudo[237863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:10 compute-0 python3.9[237865]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:31:10 compute-0 sudo[237863]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:10 compute-0 sudo[238015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fobkypcvjzeuwhdlqclflcvjlqynchjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581470.2906778-777-168762839268126/AnsiballZ_file.py'
Dec 01 09:31:10 compute-0 sudo[238015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:10 compute-0 python3.9[238017]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:31:10 compute-0 sudo[238015]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:11 compute-0 ceph-mon[75031]: pgmap v576: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:11 compute-0 sudo[238167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tatnawiolhcxkrckqjeqcjzfstzpawzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581470.9673512-777-92840553679551/AnsiballZ_file.py'
Dec 01 09:31:11 compute-0 sudo[238167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:11 compute-0 python3.9[238169]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:31:11 compute-0 sudo[238167]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:11 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v577: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:11 compute-0 sudo[238319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjjoijqvhpnmhwggattcmspkxvcfubkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581471.6539407-777-83577621791547/AnsiballZ_file.py'
Dec 01 09:31:11 compute-0 sudo[238319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:12 compute-0 python3.9[238321]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:31:12 compute-0 sudo[238319]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:12 compute-0 sudo[238471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkhbdryrrdpwbsoadabkgyqmaoyqlyzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581472.401133-777-220312957100490/AnsiballZ_file.py'
Dec 01 09:31:12 compute-0 sudo[238471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:12 compute-0 python3.9[238473]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:31:12 compute-0 sudo[238471]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:31:13
Dec 01 09:31:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:31:13 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:31:13 compute-0 ceph-mgr[75324]: [balancer INFO root] pools ['volumes', 'backups', 'images', 'vms', '.mgr', 'cephfs.cephfs.data', 'cephfs.cephfs.meta']
Dec 01 09:31:13 compute-0 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec 01 09:31:13 compute-0 ceph-mon[75031]: pgmap v577: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:31:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:31:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:31:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:31:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:31:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:31:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:31:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:31:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:31:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:31:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:31:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:31:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:31:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:31:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:31:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:31:13 compute-0 sudo[238634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wstidhteceptqkrtdygsdbpwmjyfeeye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581472.9626813-777-162922019852278/AnsiballZ_file.py'
Dec 01 09:31:13 compute-0 sudo[238634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:13 compute-0 podman[238597]: 2025-12-01 09:31:13.285082909 +0000 UTC m=+0.068328747 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 01 09:31:13 compute-0 python3.9[238643]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:31:13 compute-0 sudo[238634]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:13 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v578: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:14 compute-0 sudo[238796]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdvgzkkhzfvwybefwfhxplirpsexgaiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581473.7542698-777-222277281002376/AnsiballZ_file.py'
Dec 01 09:31:14 compute-0 sudo[238796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:14 compute-0 python3.9[238798]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:31:14 compute-0 sudo[238796]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:14 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:31:14 compute-0 sudo[238948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wthmkigzldpzrylkcqxgbugmumwpvkog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581474.4338956-834-190357619739452/AnsiballZ_file.py'
Dec 01 09:31:14 compute-0 sudo[238948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:14 compute-0 python3.9[238950]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:31:14 compute-0 sudo[238948]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:15 compute-0 ceph-mon[75031]: pgmap v578: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:15 compute-0 sudo[239100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmublftwkalaldaekusuktsefisqjyiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581475.070226-834-211108109327644/AnsiballZ_file.py'
Dec 01 09:31:15 compute-0 sudo[239100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:15 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v579: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:15 compute-0 python3.9[239102]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:31:15 compute-0 sudo[239100]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:16 compute-0 sudo[239252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvgvfhxypdyagfifcdyflioiprpqbtqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581475.7661839-834-252558328901844/AnsiballZ_file.py'
Dec 01 09:31:16 compute-0 sudo[239252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:16 compute-0 python3.9[239254]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:31:16 compute-0 sudo[239252]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:16 compute-0 sudo[239404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqvovmrmwlorhhwbzkhektkkuizajahg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581476.433437-834-30696101866862/AnsiballZ_file.py'
Dec 01 09:31:16 compute-0 sudo[239404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:16 compute-0 python3.9[239406]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:31:16 compute-0 sudo[239404]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:17 compute-0 ceph-mon[75031]: pgmap v579: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:17 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v580: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:17 compute-0 sudo[239556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itjvesgslhxyiavyhssyszyfhxpkvgyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581477.1044261-834-135056971293177/AnsiballZ_file.py'
Dec 01 09:31:17 compute-0 sudo[239556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:17 compute-0 python3.9[239558]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:31:17 compute-0 sudo[239556]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:18 compute-0 sudo[239708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhssqghdohfwjnxkreuaibuvhfrmqrim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581477.914123-834-210255098324091/AnsiballZ_file.py'
Dec 01 09:31:18 compute-0 sudo[239708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:18 compute-0 python3.9[239710]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:31:18 compute-0 sudo[239708]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:31:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:31:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:31:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:31:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:31:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:31:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:31:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:31:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:31:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:31:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:31:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:31:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec 01 09:31:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:31:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:31:18 compute-0 sudo[239860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gorpocuuvyqhiktbhibowznhjaasvczt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581478.5873353-834-96587520491489/AnsiballZ_file.py'
Dec 01 09:31:18 compute-0 sudo[239860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:19 compute-0 ceph-mon[75031]: pgmap v580: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:19 compute-0 python3.9[239862]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:31:19 compute-0 sudo[239860]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:19 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:31:19 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v581: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:19 compute-0 sudo[240012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckxnxttshhakzczkifvkdvwgvgxdijef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581479.264352-834-77894695397869/AnsiballZ_file.py'
Dec 01 09:31:19 compute-0 sudo[240012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:19 compute-0 python3.9[240014]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:31:19 compute-0 sudo[240012]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:31:20.464 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:31:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:31:20.465 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:31:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:31:20.465 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:31:20 compute-0 sudo[240164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjrwvdesutsblszvhefcwwqxmkylrhji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581480.155523-892-172262975479305/AnsiballZ_command.py'
Dec 01 09:31:20 compute-0 sudo[240164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:20 compute-0 python3.9[240166]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:31:20 compute-0 sudo[240164]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:21 compute-0 ceph-mon[75031]: pgmap v581: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:21 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v582: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:21 compute-0 python3.9[240318]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 01 09:31:22 compute-0 sudo[240483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uknzjsdpqwvvxzfrcnomzofquvhgnyhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581482.0310311-910-35567362739790/AnsiballZ_systemd_service.py'
Dec 01 09:31:22 compute-0 sudo[240483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:22 compute-0 podman[240442]: 2025-12-01 09:31:22.400367523 +0000 UTC m=+0.097825596 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 01 09:31:22 compute-0 python3.9[240490]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 01 09:31:22 compute-0 systemd[1]: Reloading.
Dec 01 09:31:22 compute-0 systemd-rc-local-generator[240520]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:31:22 compute-0 systemd-sysv-generator[240524]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:31:23 compute-0 sudo[240483]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:23 compute-0 ceph-mon[75031]: pgmap v582: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:23 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v583: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:23 compute-0 sudo[240682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhxakezmsarwxwqdgdrsfnlbatypifon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581483.2568214-918-10007074477312/AnsiballZ_command.py'
Dec 01 09:31:23 compute-0 sudo[240682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:23 compute-0 python3.9[240684]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:31:23 compute-0 sudo[240682]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:24 compute-0 sudo[240835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-reemxyepfitmvrnpxmkeiatsomnmxndo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581483.918791-918-121831939914725/AnsiballZ_command.py'
Dec 01 09:31:24 compute-0 sudo[240835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:24 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:31:24 compute-0 python3.9[240837]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:31:24 compute-0 sudo[240835]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:24 compute-0 sudo[240988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plrognexouvzuapdnjxxmfeqmugyaers ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581484.5656836-918-117413034596037/AnsiballZ_command.py'
Dec 01 09:31:24 compute-0 sudo[240988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:24 compute-0 python3.9[240990]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:31:25 compute-0 sudo[240988]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:25 compute-0 ceph-mon[75031]: pgmap v583: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:25 compute-0 sudo[241141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szogajbakdrqmupihdusytsztxczdodb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581485.1691515-918-171974326465328/AnsiballZ_command.py'
Dec 01 09:31:25 compute-0 sudo[241141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:25 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v584: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:25 compute-0 python3.9[241143]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:31:25 compute-0 sudo[241141]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:26 compute-0 sudo[241294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmjhjszeecrvkmulokcxhkotblvrcucn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581485.7566457-918-134972812825932/AnsiballZ_command.py'
Dec 01 09:31:26 compute-0 sudo[241294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:26 compute-0 podman[241296]: 2025-12-01 09:31:26.112489761 +0000 UTC m=+0.057417433 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 01 09:31:26 compute-0 python3.9[241297]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:31:26 compute-0 sudo[241294]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:26 compute-0 sudo[241465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cttaffdpvrwfpipotrfkolmpsrlyhzvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581486.3693936-918-159438119534495/AnsiballZ_command.py'
Dec 01 09:31:26 compute-0 sudo[241465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:26 compute-0 python3.9[241467]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:31:26 compute-0 sudo[241465]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:27 compute-0 ceph-mon[75031]: pgmap v584: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:27 compute-0 sudo[241618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlxiltfiiiahjgnyvlzdasjnhjldtbwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581486.9768968-918-100517623437107/AnsiballZ_command.py'
Dec 01 09:31:27 compute-0 sudo[241618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:27 compute-0 python3.9[241620]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:31:27 compute-0 sudo[241618]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:27 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v585: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:27 compute-0 sudo[241771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rofhlzlzjjaopgjngebhccffgchygjyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581487.5775516-918-226113223044386/AnsiballZ_command.py'
Dec 01 09:31:27 compute-0 sudo[241771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:28 compute-0 python3.9[241773]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:31:28 compute-0 sudo[241771]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:29 compute-0 ceph-mon[75031]: pgmap v585: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:29 compute-0 sudo[241924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysrphejvucemqrixwnirsvmjphmtvobr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581488.9541328-997-202727487290698/AnsiballZ_file.py'
Dec 01 09:31:29 compute-0 sudo[241924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:29 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:31:29 compute-0 python3.9[241926]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:31:29 compute-0 sudo[241924]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:29 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v586: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:29 compute-0 sudo[242076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gempfumhwgyivemlrxohqkdnfjvtncvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581489.59927-997-42729480204584/AnsiballZ_file.py'
Dec 01 09:31:29 compute-0 sudo[242076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:30 compute-0 python3.9[242078]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:31:30 compute-0 sudo[242076]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:30 compute-0 sudo[242228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyiarjobxbjhaljookvrwnkneevwisng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581490.2061515-997-167176122727827/AnsiballZ_file.py'
Dec 01 09:31:30 compute-0 sudo[242228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:30 compute-0 python3.9[242230]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:31:30 compute-0 sudo[242228]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:31 compute-0 sudo[242380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puhsaqestmxbxrimkvevbzkufivzfczp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581490.8492064-1019-69594718984302/AnsiballZ_file.py'
Dec 01 09:31:31 compute-0 sudo[242380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:31 compute-0 ceph-mon[75031]: pgmap v586: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:31 compute-0 python3.9[242382]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:31:31 compute-0 sudo[242380]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:31 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v587: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:31 compute-0 sudo[242532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ankttnaprsxsyciazbgnsyuiwzybglmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581491.4509745-1019-150489230897531/AnsiballZ_file.py'
Dec 01 09:31:31 compute-0 sudo[242532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:31 compute-0 python3.9[242534]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:31:31 compute-0 sudo[242532]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:32 compute-0 sudo[242684]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnadouspjqxsfyvrhhtcqgburtwvheoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581492.121539-1019-178147280164923/AnsiballZ_file.py'
Dec 01 09:31:32 compute-0 sudo[242684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:32 compute-0 python3.9[242686]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:31:32 compute-0 sudo[242684]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:33 compute-0 sudo[242836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpahryogiumqqwqpdoinzdbhbouvbbfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581492.7409415-1019-279757031043363/AnsiballZ_file.py'
Dec 01 09:31:33 compute-0 sudo[242836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:33 compute-0 sudo[242839]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:31:33 compute-0 sudo[242839]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:31:33 compute-0 sudo[242839]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:33 compute-0 sudo[242864]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:31:33 compute-0 sudo[242864]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:31:33 compute-0 sudo[242864]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:33 compute-0 ceph-mon[75031]: pgmap v587: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:33 compute-0 python3.9[242838]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:31:33 compute-0 sudo[242889]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:31:33 compute-0 sudo[242889]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:31:33 compute-0 sudo[242889]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:33 compute-0 sudo[242836]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:33 compute-0 sudo[242914]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Dec 01 09:31:33 compute-0 sudo[242914]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:31:33 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v588: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:33 compute-0 sudo[243105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jojbpwoknanhapmxyqeqbtvcbrddymgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581493.3595848-1019-169157656853593/AnsiballZ_file.py'
Dec 01 09:31:33 compute-0 sudo[243105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:33 compute-0 sudo[242914]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:33 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:31:33 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:31:33 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec 01 09:31:33 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:31:33 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec 01 09:31:33 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:31:33 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 328781bf-619a-4e2d-8362-c884c9d853b4 does not exist
Dec 01 09:31:33 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 7a0e186b-c5ce-44cc-834b-1fecd670fe7f does not exist
Dec 01 09:31:33 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev b3cf4ed0-5a30-40a7-bf27-a6f2cbd2875c does not exist
Dec 01 09:31:33 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec 01 09:31:33 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:31:33 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec 01 09:31:33 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:31:33 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:31:33 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:31:33 compute-0 python3.9[243109]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:31:33 compute-0 sudo[243122]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:31:33 compute-0 sudo[243122]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:31:33 compute-0 sudo[243105]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:33 compute-0 sudo[243122]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:33 compute-0 sudo[243147]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:31:33 compute-0 sudo[243147]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:31:33 compute-0 sudo[243147]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:33 compute-0 sudo[243196]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:31:33 compute-0 sudo[243196]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:31:33 compute-0 sudo[243196]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:33 compute-0 sudo[243244]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --yes --no-systemd
Dec 01 09:31:33 compute-0 sudo[243244]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:31:34 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:31:34 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:31:34 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:31:34 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:31:34 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:31:34 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:31:34 compute-0 sudo[243396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyrrmizgixyhaovlzqqwpjrndhlqsblv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581493.949371-1019-185223801551764/AnsiballZ_file.py'
Dec 01 09:31:34 compute-0 sudo[243396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:34 compute-0 podman[243415]: 2025-12-01 09:31:34.298270658 +0000 UTC m=+0.038488178 container create 7cbd28e6f58db1cec8e5f96df9c4afbc0c3fd035d26c57e256caa26587fd758b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_kalam, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 01 09:31:34 compute-0 systemd[1]: Started libpod-conmon-7cbd28e6f58db1cec8e5f96df9c4afbc0c3fd035d26c57e256caa26587fd758b.scope.
Dec 01 09:31:34 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:31:34 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:31:34 compute-0 podman[243415]: 2025-12-01 09:31:34.28165245 +0000 UTC m=+0.021870000 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:31:34 compute-0 podman[243415]: 2025-12-01 09:31:34.385164029 +0000 UTC m=+0.125381599 container init 7cbd28e6f58db1cec8e5f96df9c4afbc0c3fd035d26c57e256caa26587fd758b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_kalam, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_REF=reef, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:31:34 compute-0 podman[243415]: 2025-12-01 09:31:34.394039174 +0000 UTC m=+0.134256714 container start 7cbd28e6f58db1cec8e5f96df9c4afbc0c3fd035d26c57e256caa26587fd758b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_kalam, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:31:34 compute-0 podman[243415]: 2025-12-01 09:31:34.399301675 +0000 UTC m=+0.139519235 container attach 7cbd28e6f58db1cec8e5f96df9c4afbc0c3fd035d26c57e256caa26587fd758b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_kalam, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:31:34 compute-0 keen_kalam[243431]: 167 167
Dec 01 09:31:34 compute-0 systemd[1]: libpod-7cbd28e6f58db1cec8e5f96df9c4afbc0c3fd035d26c57e256caa26587fd758b.scope: Deactivated successfully.
Dec 01 09:31:34 compute-0 conmon[243431]: conmon 7cbd28e6f58db1cec8e5 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-7cbd28e6f58db1cec8e5f96df9c4afbc0c3fd035d26c57e256caa26587fd758b.scope/container/memory.events
Dec 01 09:31:34 compute-0 podman[243415]: 2025-12-01 09:31:34.401481728 +0000 UTC m=+0.141699268 container died 7cbd28e6f58db1cec8e5f96df9c4afbc0c3fd035d26c57e256caa26587fd758b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_kalam, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:31:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-7997302c7d1ad48698deb4da87f2f6ac9fb2b98f52059da05c581182aaa5dc3d-merged.mount: Deactivated successfully.
Dec 01 09:31:34 compute-0 python3.9[243407]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:31:34 compute-0 podman[243415]: 2025-12-01 09:31:34.446649178 +0000 UTC m=+0.186866698 container remove 7cbd28e6f58db1cec8e5f96df9c4afbc0c3fd035d26c57e256caa26587fd758b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_kalam, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:31:34 compute-0 systemd[1]: libpod-conmon-7cbd28e6f58db1cec8e5f96df9c4afbc0c3fd035d26c57e256caa26587fd758b.scope: Deactivated successfully.
Dec 01 09:31:34 compute-0 sudo[243396]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:34 compute-0 podman[243486]: 2025-12-01 09:31:34.622633212 +0000 UTC m=+0.043795201 container create 62498806cf4c758aa2772a6cc931cf65a94e2f25ce696fa3e3945a6c0c55d867 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_pasteur, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Dec 01 09:31:34 compute-0 systemd[1]: Started libpod-conmon-62498806cf4c758aa2772a6cc931cf65a94e2f25ce696fa3e3945a6c0c55d867.scope.
Dec 01 09:31:34 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:31:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bdd467dc8298df01010ecb1bfa1fc253fb485e024864cd815d6107c68b255822/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:31:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bdd467dc8298df01010ecb1bfa1fc253fb485e024864cd815d6107c68b255822/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:31:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bdd467dc8298df01010ecb1bfa1fc253fb485e024864cd815d6107c68b255822/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:31:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bdd467dc8298df01010ecb1bfa1fc253fb485e024864cd815d6107c68b255822/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:31:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bdd467dc8298df01010ecb1bfa1fc253fb485e024864cd815d6107c68b255822/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:31:34 compute-0 podman[243486]: 2025-12-01 09:31:34.6037936 +0000 UTC m=+0.024955619 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:31:34 compute-0 podman[243486]: 2025-12-01 09:31:34.709078589 +0000 UTC m=+0.130240668 container init 62498806cf4c758aa2772a6cc931cf65a94e2f25ce696fa3e3945a6c0c55d867 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_pasteur, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Dec 01 09:31:34 compute-0 podman[243486]: 2025-12-01 09:31:34.717589304 +0000 UTC m=+0.138751303 container start 62498806cf4c758aa2772a6cc931cf65a94e2f25ce696fa3e3945a6c0c55d867 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_pasteur, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Dec 01 09:31:34 compute-0 podman[243486]: 2025-12-01 09:31:34.721707673 +0000 UTC m=+0.142869692 container attach 62498806cf4c758aa2772a6cc931cf65a94e2f25ce696fa3e3945a6c0c55d867 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_pasteur, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:31:34 compute-0 sudo[243625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewnwehwezikkfaxtuzheugakrodnroxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581494.5927646-1019-114208338200326/AnsiballZ_file.py'
Dec 01 09:31:34 compute-0 sudo[243625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:35 compute-0 python3.9[243627]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:31:35 compute-0 sudo[243625]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:35 compute-0 ceph-mon[75031]: pgmap v588: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:35 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v589: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:35 compute-0 elegant_pasteur[243547]: --> passed data devices: 0 physical, 3 LVM
Dec 01 09:31:35 compute-0 elegant_pasteur[243547]: --> relative data size: 1.0
Dec 01 09:31:35 compute-0 elegant_pasteur[243547]: --> All data devices are unavailable
Dec 01 09:31:35 compute-0 systemd[1]: libpod-62498806cf4c758aa2772a6cc931cf65a94e2f25ce696fa3e3945a6c0c55d867.scope: Deactivated successfully.
Dec 01 09:31:35 compute-0 podman[243486]: 2025-12-01 09:31:35.753159052 +0000 UTC m=+1.174321051 container died 62498806cf4c758aa2772a6cc931cf65a94e2f25ce696fa3e3945a6c0c55d867 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_pasteur, org.label-schema.build-date=20250507, ceph=True, CEPH_REF=reef, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Dec 01 09:31:35 compute-0 systemd[1]: libpod-62498806cf4c758aa2772a6cc931cf65a94e2f25ce696fa3e3945a6c0c55d867.scope: Consumed 1.000s CPU time.
Dec 01 09:31:35 compute-0 systemd[1]: var-lib-containers-storage-overlay-bdd467dc8298df01010ecb1bfa1fc253fb485e024864cd815d6107c68b255822-merged.mount: Deactivated successfully.
Dec 01 09:31:35 compute-0 podman[243486]: 2025-12-01 09:31:35.804582082 +0000 UTC m=+1.225744091 container remove 62498806cf4c758aa2772a6cc931cf65a94e2f25ce696fa3e3945a6c0c55d867 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_pasteur, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:31:35 compute-0 systemd[1]: libpod-conmon-62498806cf4c758aa2772a6cc931cf65a94e2f25ce696fa3e3945a6c0c55d867.scope: Deactivated successfully.
Dec 01 09:31:35 compute-0 sudo[243244]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:35 compute-0 sudo[243688]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:31:35 compute-0 sudo[243688]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:31:35 compute-0 sudo[243688]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:35 compute-0 sudo[243713]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:31:35 compute-0 sudo[243713]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:31:35 compute-0 sudo[243713]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:36 compute-0 sudo[243738]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:31:36 compute-0 sudo[243738]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:31:36 compute-0 sudo[243738]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:36 compute-0 sudo[243763]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- lvm list --format json
Dec 01 09:31:36 compute-0 sudo[243763]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:31:36 compute-0 podman[243829]: 2025-12-01 09:31:36.450054296 +0000 UTC m=+0.041522806 container create 5caa3003f096f44a1d780c9c009d496350bef3f3100564368bafe2953e614e29 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_wiles, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Dec 01 09:31:36 compute-0 systemd[1]: Started libpod-conmon-5caa3003f096f44a1d780c9c009d496350bef3f3100564368bafe2953e614e29.scope.
Dec 01 09:31:36 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:31:36 compute-0 podman[243829]: 2025-12-01 09:31:36.521482371 +0000 UTC m=+0.112950901 container init 5caa3003f096f44a1d780c9c009d496350bef3f3100564368bafe2953e614e29 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_wiles, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=reef)
Dec 01 09:31:36 compute-0 podman[243829]: 2025-12-01 09:31:36.529350898 +0000 UTC m=+0.120819408 container start 5caa3003f096f44a1d780c9c009d496350bef3f3100564368bafe2953e614e29 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_wiles, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True)
Dec 01 09:31:36 compute-0 podman[243829]: 2025-12-01 09:31:36.433441188 +0000 UTC m=+0.024909718 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:31:36 compute-0 podman[243829]: 2025-12-01 09:31:36.532982712 +0000 UTC m=+0.124451252 container attach 5caa3003f096f44a1d780c9c009d496350bef3f3100564368bafe2953e614e29 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_wiles, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef)
Dec 01 09:31:36 compute-0 gifted_wiles[243845]: 167 167
Dec 01 09:31:36 compute-0 systemd[1]: libpod-5caa3003f096f44a1d780c9c009d496350bef3f3100564368bafe2953e614e29.scope: Deactivated successfully.
Dec 01 09:31:36 compute-0 podman[243829]: 2025-12-01 09:31:36.535041001 +0000 UTC m=+0.126509511 container died 5caa3003f096f44a1d780c9c009d496350bef3f3100564368bafe2953e614e29 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_wiles, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:31:36 compute-0 systemd[1]: var-lib-containers-storage-overlay-75f41480405acf7135bdc71fdb024b4069c4c85e95081da9febcd6c27170516f-merged.mount: Deactivated successfully.
Dec 01 09:31:36 compute-0 podman[243829]: 2025-12-01 09:31:36.571720727 +0000 UTC m=+0.163189237 container remove 5caa3003f096f44a1d780c9c009d496350bef3f3100564368bafe2953e614e29 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_wiles, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507)
Dec 01 09:31:36 compute-0 systemd[1]: libpod-conmon-5caa3003f096f44a1d780c9c009d496350bef3f3100564368bafe2953e614e29.scope: Deactivated successfully.
Dec 01 09:31:36 compute-0 podman[243869]: 2025-12-01 09:31:36.744804587 +0000 UTC m=+0.043399029 container create a734cd920d8a93ec1527d9c562eed1efa5319011745ee4666c13b1f8d64ade6e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_ganguly, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Dec 01 09:31:36 compute-0 systemd[1]: Started libpod-conmon-a734cd920d8a93ec1527d9c562eed1efa5319011745ee4666c13b1f8d64ade6e.scope.
Dec 01 09:31:36 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:31:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/42c25d8d5f2c358e4a5459c59cbe467416dafe9ac9857ba4f4554c46a131245d/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:31:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/42c25d8d5f2c358e4a5459c59cbe467416dafe9ac9857ba4f4554c46a131245d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:31:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/42c25d8d5f2c358e4a5459c59cbe467416dafe9ac9857ba4f4554c46a131245d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:31:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/42c25d8d5f2c358e4a5459c59cbe467416dafe9ac9857ba4f4554c46a131245d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:31:36 compute-0 podman[243869]: 2025-12-01 09:31:36.727063407 +0000 UTC m=+0.025657879 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:31:36 compute-0 podman[243869]: 2025-12-01 09:31:36.829155815 +0000 UTC m=+0.127750257 container init a734cd920d8a93ec1527d9c562eed1efa5319011745ee4666c13b1f8d64ade6e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_ganguly, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:31:36 compute-0 podman[243869]: 2025-12-01 09:31:36.835961171 +0000 UTC m=+0.134555613 container start a734cd920d8a93ec1527d9c562eed1efa5319011745ee4666c13b1f8d64ade6e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_ganguly, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=reef)
Dec 01 09:31:36 compute-0 podman[243869]: 2025-12-01 09:31:36.838755961 +0000 UTC m=+0.137350403 container attach a734cd920d8a93ec1527d9c562eed1efa5319011745ee4666c13b1f8d64ade6e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_ganguly, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2)
Dec 01 09:31:37 compute-0 ceph-mon[75031]: pgmap v589: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:37 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v590: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:37 compute-0 kind_ganguly[243886]: {
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:     "0": [
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:         {
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             "devices": [
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "/dev/loop3"
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             ],
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             "lv_name": "ceph_lv0",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             "lv_size": "21470642176",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             "name": "ceph_lv0",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             "tags": {
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.cluster_name": "ceph",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.crush_device_class": "",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.encrypted": "0",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.osd_id": "0",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.type": "block",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.vdo": "0"
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             },
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             "type": "block",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             "vg_name": "ceph_vg0"
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:         }
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:     ],
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:     "1": [
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:         {
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             "devices": [
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "/dev/loop4"
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             ],
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             "lv_name": "ceph_lv1",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             "lv_size": "21470642176",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             "name": "ceph_lv1",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             "tags": {
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.cluster_name": "ceph",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.crush_device_class": "",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.encrypted": "0",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.osd_id": "1",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.type": "block",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.vdo": "0"
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             },
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             "type": "block",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             "vg_name": "ceph_vg1"
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:         }
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:     ],
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:     "2": [
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:         {
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             "devices": [
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "/dev/loop5"
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             ],
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             "lv_name": "ceph_lv2",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             "lv_size": "21470642176",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             "name": "ceph_lv2",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             "tags": {
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.cluster_name": "ceph",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.crush_device_class": "",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.encrypted": "0",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.osd_id": "2",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.type": "block",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:                 "ceph.vdo": "0"
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             },
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             "type": "block",
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:             "vg_name": "ceph_vg2"
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:         }
Dec 01 09:31:37 compute-0 kind_ganguly[243886]:     ]
Dec 01 09:31:37 compute-0 kind_ganguly[243886]: }
Dec 01 09:31:37 compute-0 systemd[1]: libpod-a734cd920d8a93ec1527d9c562eed1efa5319011745ee4666c13b1f8d64ade6e.scope: Deactivated successfully.
Dec 01 09:31:37 compute-0 podman[243869]: 2025-12-01 09:31:37.617903361 +0000 UTC m=+0.916497813 container died a734cd920d8a93ec1527d9c562eed1efa5319011745ee4666c13b1f8d64ade6e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_ganguly, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:31:37 compute-0 systemd[1]: var-lib-containers-storage-overlay-42c25d8d5f2c358e4a5459c59cbe467416dafe9ac9857ba4f4554c46a131245d-merged.mount: Deactivated successfully.
Dec 01 09:31:37 compute-0 podman[243869]: 2025-12-01 09:31:37.673026277 +0000 UTC m=+0.971620709 container remove a734cd920d8a93ec1527d9c562eed1efa5319011745ee4666c13b1f8d64ade6e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_ganguly, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec 01 09:31:37 compute-0 systemd[1]: libpod-conmon-a734cd920d8a93ec1527d9c562eed1efa5319011745ee4666c13b1f8d64ade6e.scope: Deactivated successfully.
Dec 01 09:31:37 compute-0 sudo[243763]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:37 compute-0 sudo[243907]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:31:37 compute-0 sudo[243907]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:31:37 compute-0 sudo[243907]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:37 compute-0 sudo[243932]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:31:37 compute-0 sudo[243932]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:31:37 compute-0 sudo[243932]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:37 compute-0 sudo[243957]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:31:37 compute-0 sudo[243957]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:31:37 compute-0 sudo[243957]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:37 compute-0 sudo[243982]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- raw list --format json
Dec 01 09:31:37 compute-0 sudo[243982]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:31:38 compute-0 podman[244047]: 2025-12-01 09:31:38.346462946 +0000 UTC m=+0.060837842 container create 8b676d489026d67d7f8a24bb344169a6a52f314ed386a70e1da668f960fc0f01 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_jennings, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:31:38 compute-0 systemd[1]: Started libpod-conmon-8b676d489026d67d7f8a24bb344169a6a52f314ed386a70e1da668f960fc0f01.scope.
Dec 01 09:31:38 compute-0 podman[244047]: 2025-12-01 09:31:38.324743191 +0000 UTC m=+0.039118077 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:31:38 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:31:38 compute-0 podman[244047]: 2025-12-01 09:31:38.442660934 +0000 UTC m=+0.157035810 container init 8b676d489026d67d7f8a24bb344169a6a52f314ed386a70e1da668f960fc0f01 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_jennings, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, OSD_FLAVOR=default)
Dec 01 09:31:38 compute-0 podman[244047]: 2025-12-01 09:31:38.452432035 +0000 UTC m=+0.166806911 container start 8b676d489026d67d7f8a24bb344169a6a52f314ed386a70e1da668f960fc0f01 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_jennings, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Dec 01 09:31:38 compute-0 podman[244047]: 2025-12-01 09:31:38.456354778 +0000 UTC m=+0.170729634 container attach 8b676d489026d67d7f8a24bb344169a6a52f314ed386a70e1da668f960fc0f01 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_jennings, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec 01 09:31:38 compute-0 sweet_jennings[244063]: 167 167
Dec 01 09:31:38 compute-0 systemd[1]: libpod-8b676d489026d67d7f8a24bb344169a6a52f314ed386a70e1da668f960fc0f01.scope: Deactivated successfully.
Dec 01 09:31:38 compute-0 podman[244047]: 2025-12-01 09:31:38.462166715 +0000 UTC m=+0.176541661 container died 8b676d489026d67d7f8a24bb344169a6a52f314ed386a70e1da668f960fc0f01 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_jennings, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507)
Dec 01 09:31:38 compute-0 systemd[1]: var-lib-containers-storage-overlay-2956d7cffda573b5985a5904a098e3b398e7c8c256fe2432238256910069cdd1-merged.mount: Deactivated successfully.
Dec 01 09:31:38 compute-0 podman[244047]: 2025-12-01 09:31:38.526367353 +0000 UTC m=+0.240742249 container remove 8b676d489026d67d7f8a24bb344169a6a52f314ed386a70e1da668f960fc0f01 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_jennings, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0)
Dec 01 09:31:38 compute-0 systemd[1]: libpod-conmon-8b676d489026d67d7f8a24bb344169a6a52f314ed386a70e1da668f960fc0f01.scope: Deactivated successfully.
Dec 01 09:31:38 compute-0 podman[244085]: 2025-12-01 09:31:38.722684672 +0000 UTC m=+0.056795205 container create 0ecc457c4124a5013c8f64cca80b63b7b95e792824979ae1d1d7a871436a2fed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_euler, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec 01 09:31:38 compute-0 systemd[1]: Started libpod-conmon-0ecc457c4124a5013c8f64cca80b63b7b95e792824979ae1d1d7a871436a2fed.scope.
Dec 01 09:31:38 compute-0 podman[244085]: 2025-12-01 09:31:38.691460743 +0000 UTC m=+0.025571336 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:31:38 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:31:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a9ab0315d09837ebcd9ce95ff055860f90f4629967c7a6832f9b14e55d9a95c/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:31:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a9ab0315d09837ebcd9ce95ff055860f90f4629967c7a6832f9b14e55d9a95c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:31:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a9ab0315d09837ebcd9ce95ff055860f90f4629967c7a6832f9b14e55d9a95c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:31:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a9ab0315d09837ebcd9ce95ff055860f90f4629967c7a6832f9b14e55d9a95c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:31:38 compute-0 podman[244085]: 2025-12-01 09:31:38.822117563 +0000 UTC m=+0.156228056 container init 0ecc457c4124a5013c8f64cca80b63b7b95e792824979ae1d1d7a871436a2fed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_euler, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.schema-version=1.0)
Dec 01 09:31:38 compute-0 podman[244085]: 2025-12-01 09:31:38.835321663 +0000 UTC m=+0.169432156 container start 0ecc457c4124a5013c8f64cca80b63b7b95e792824979ae1d1d7a871436a2fed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_euler, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:31:38 compute-0 podman[244085]: 2025-12-01 09:31:38.838161035 +0000 UTC m=+0.172271548 container attach 0ecc457c4124a5013c8f64cca80b63b7b95e792824979ae1d1d7a871436a2fed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_euler, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:31:39 compute-0 ceph-mon[75031]: pgmap v590: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:39 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:31:39 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v591: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:39 compute-0 intelligent_euler[244102]: {
Dec 01 09:31:39 compute-0 intelligent_euler[244102]:     "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec 01 09:31:39 compute-0 intelligent_euler[244102]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:31:39 compute-0 intelligent_euler[244102]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 01 09:31:39 compute-0 intelligent_euler[244102]:         "osd_id": 0,
Dec 01 09:31:39 compute-0 intelligent_euler[244102]:         "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:31:39 compute-0 intelligent_euler[244102]:         "type": "bluestore"
Dec 01 09:31:39 compute-0 intelligent_euler[244102]:     },
Dec 01 09:31:39 compute-0 intelligent_euler[244102]:     "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec 01 09:31:39 compute-0 intelligent_euler[244102]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:31:39 compute-0 intelligent_euler[244102]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 01 09:31:39 compute-0 intelligent_euler[244102]:         "osd_id": 1,
Dec 01 09:31:39 compute-0 intelligent_euler[244102]:         "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:31:39 compute-0 intelligent_euler[244102]:         "type": "bluestore"
Dec 01 09:31:39 compute-0 intelligent_euler[244102]:     },
Dec 01 09:31:39 compute-0 intelligent_euler[244102]:     "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec 01 09:31:39 compute-0 intelligent_euler[244102]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:31:39 compute-0 intelligent_euler[244102]:         "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec 01 09:31:39 compute-0 intelligent_euler[244102]:         "osd_id": 2,
Dec 01 09:31:39 compute-0 intelligent_euler[244102]:         "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:31:39 compute-0 intelligent_euler[244102]:         "type": "bluestore"
Dec 01 09:31:39 compute-0 intelligent_euler[244102]:     }
Dec 01 09:31:39 compute-0 intelligent_euler[244102]: }
Dec 01 09:31:39 compute-0 systemd[1]: libpod-0ecc457c4124a5013c8f64cca80b63b7b95e792824979ae1d1d7a871436a2fed.scope: Deactivated successfully.
Dec 01 09:31:39 compute-0 podman[244135]: 2025-12-01 09:31:39.849552847 +0000 UTC m=+0.029486089 container died 0ecc457c4124a5013c8f64cca80b63b7b95e792824979ae1d1d7a871436a2fed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_euler, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True)
Dec 01 09:31:39 compute-0 systemd[1]: var-lib-containers-storage-overlay-8a9ab0315d09837ebcd9ce95ff055860f90f4629967c7a6832f9b14e55d9a95c-merged.mount: Deactivated successfully.
Dec 01 09:31:39 compute-0 podman[244135]: 2025-12-01 09:31:39.904703494 +0000 UTC m=+0.084636646 container remove 0ecc457c4124a5013c8f64cca80b63b7b95e792824979ae1d1d7a871436a2fed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_euler, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Dec 01 09:31:39 compute-0 systemd[1]: libpod-conmon-0ecc457c4124a5013c8f64cca80b63b7b95e792824979ae1d1d7a871436a2fed.scope: Deactivated successfully.
Dec 01 09:31:39 compute-0 sudo[243982]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:39 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:31:39 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:31:39 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:31:39 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:31:39 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 222a10f9-4c30-4c00-bc67-6e5f56907cbc does not exist
Dec 01 09:31:40 compute-0 sudo[244173]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:31:40 compute-0 sudo[244173]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:31:40 compute-0 sudo[244173]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:40 compute-0 sudo[244227]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:31:40 compute-0 sudo[244227]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:31:40 compute-0 sudo[244227]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:40 compute-0 sudo[244325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzlicshzmupfsiffnaulxziccaielnum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581499.9625099-1208-148700212883540/AnsiballZ_getent.py'
Dec 01 09:31:40 compute-0 sudo[244325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:40 compute-0 python3.9[244327]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Dec 01 09:31:40 compute-0 sudo[244325]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:40 compute-0 ceph-mon[75031]: pgmap v591: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:40 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:31:40 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:31:41 compute-0 sudo[244478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnujhezomiksibvnonrwhyzzjkxoazdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581500.9424808-1216-219277161334136/AnsiballZ_group.py'
Dec 01 09:31:41 compute-0 sudo[244478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:41 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v592: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:41 compute-0 python3.9[244480]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 01 09:31:41 compute-0 groupadd[244481]: group added to /etc/group: name=nova, GID=42436
Dec 01 09:31:41 compute-0 groupadd[244481]: group added to /etc/gshadow: name=nova
Dec 01 09:31:41 compute-0 groupadd[244481]: new group: name=nova, GID=42436
Dec 01 09:31:41 compute-0 sudo[244478]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:42 compute-0 sudo[244636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxgurdmuyuthshxwbfmtbsnmnlpxtaup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581501.8598404-1224-204679182665082/AnsiballZ_user.py'
Dec 01 09:31:42 compute-0 sudo[244636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:42 compute-0 python3.9[244638]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 01 09:31:42 compute-0 useradd[244640]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Dec 01 09:31:42 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 01 09:31:42 compute-0 useradd[244640]: add 'nova' to group 'libvirt'
Dec 01 09:31:42 compute-0 useradd[244640]: add 'nova' to shadow group 'libvirt'
Dec 01 09:31:42 compute-0 sudo[244636]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:42 compute-0 ceph-mon[75031]: pgmap v592: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:31:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:31:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:31:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:31:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:31:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:31:43 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v593: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:43 compute-0 sshd-session[244672]: Accepted publickey for zuul from 192.168.122.30 port 34540 ssh2: ECDSA SHA256:qeYLzcMt7u+aIXWvxTim9ZQrhR80x0VMZgLc05vjQmw
Dec 01 09:31:43 compute-0 systemd-logind[788]: New session 51 of user zuul.
Dec 01 09:31:43 compute-0 systemd[1]: Started Session 51 of User zuul.
Dec 01 09:31:43 compute-0 sshd-session[244672]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:31:43 compute-0 podman[244674]: 2025-12-01 09:31:43.829752408 +0000 UTC m=+0.095260532 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:31:43 compute-0 sshd-session[244681]: Received disconnect from 192.168.122.30 port 34540:11: disconnected by user
Dec 01 09:31:43 compute-0 sshd-session[244681]: Disconnected from user zuul 192.168.122.30 port 34540
Dec 01 09:31:43 compute-0 sshd-session[244672]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:31:43 compute-0 systemd[1]: session-51.scope: Deactivated successfully.
Dec 01 09:31:43 compute-0 systemd-logind[788]: Session 51 logged out. Waiting for processes to exit.
Dec 01 09:31:43 compute-0 systemd-logind[788]: Removed session 51.
Dec 01 09:31:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:31:44 compute-0 python3.9[244846]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:31:44 compute-0 ceph-mon[75031]: pgmap v593: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:45 compute-0 python3.9[244967]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764581504.0485396-1249-175207014057183/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:31:45 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v594: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:45 compute-0 python3.9[245117]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:31:46 compute-0 python3.9[245193]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:31:46 compute-0 python3.9[245343]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:31:46 compute-0 ceph-mon[75031]: pgmap v594: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:47 compute-0 python3.9[245464]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764581506.1904726-1249-127275162569772/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:31:47 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v595: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:47 compute-0 python3.9[245614]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:31:48 compute-0 python3.9[245735]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764581507.3064673-1249-30708396262785/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:31:48 compute-0 python3.9[245885]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:31:48 compute-0 ceph-mon[75031]: pgmap v595: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:49 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:31:49 compute-0 python3.9[246006]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764581508.5360897-1249-255925187640533/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:31:49 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v596: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:50 compute-0 python3.9[246156]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:31:50 compute-0 python3.9[246277]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764581509.6196575-1249-240847757232484/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:31:51 compute-0 ceph-mon[75031]: pgmap v596: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:51 compute-0 sudo[246427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnbbhiwyhdbfgyexcwaqopblqfyzlguj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581510.7748675-1332-63428265631967/AnsiballZ_file.py'
Dec 01 09:31:51 compute-0 sudo[246427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:51 compute-0 python3.9[246429]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:31:51 compute-0 sudo[246427]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:51 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v597: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:51 compute-0 sudo[246579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udenjhmbyltjudshwomigknuxdnkqaqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581511.3994172-1340-107609948626129/AnsiballZ_copy.py'
Dec 01 09:31:51 compute-0 sudo[246579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:51 compute-0 python3.9[246581]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:31:51 compute-0 sudo[246579]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:52 compute-0 sudo[246731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wccndweayihxxeexipuooatdcplewbbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581512.0721416-1348-258324886142826/AnsiballZ_stat.py'
Dec 01 09:31:52 compute-0 sudo[246731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:52 compute-0 python3.9[246733]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:31:52 compute-0 sudo[246731]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:52 compute-0 sudo[246893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvenwbmgnsvdylopwpejnxrrnghqvzwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581512.6393037-1356-255026109672473/AnsiballZ_stat.py'
Dec 01 09:31:52 compute-0 sudo[246893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:52 compute-0 podman[246857]: 2025-12-01 09:31:52.981829952 +0000 UTC m=+0.091943117 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 01 09:31:53 compute-0 ceph-mon[75031]: pgmap v597: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:53 compute-0 python3.9[246900]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:31:53 compute-0 sudo[246893]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:53 compute-0 sudo[247032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwjhcjluclvfnulcmjrklbolkpjtadfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581512.6393037-1356-255026109672473/AnsiballZ_copy.py'
Dec 01 09:31:53 compute-0 sudo[247032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:53 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v598: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:53 compute-0 python3.9[247034]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1764581512.6393037-1356-255026109672473/.source _original_basename=.nsdtykkl follow=False checksum=99e1f0f07cd296b27b32892c19cd6590291d74e3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Dec 01 09:31:53 compute-0 sudo[247032]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:54 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:31:54 compute-0 python3.9[247186]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:31:55 compute-0 ceph-mon[75031]: pgmap v598: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:55 compute-0 python3.9[247338]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:31:55 compute-0 python3.9[247459]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764581514.602423-1382-248618224208453/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:31:55 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v599: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:56 compute-0 python3.9[247609]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:31:56 compute-0 podman[247704]: 2025-12-01 09:31:56.553248342 +0000 UTC m=+0.051618876 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 01 09:31:56 compute-0 python3.9[247743]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764581515.7262225-1397-255598607723688/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:31:57 compute-0 ceph-mon[75031]: pgmap v599: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:57 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v600: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:57 compute-0 sudo[247899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thgxyirajcwtgzsckhowmcjqbiizvccz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581517.6705468-1414-172203115963501/AnsiballZ_container_config_data.py'
Dec 01 09:31:57 compute-0 sudo[247899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:58 compute-0 python3.9[247901]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Dec 01 09:31:58 compute-0 sudo[247899]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:58 compute-0 sudo[248051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xegvpvigxbuablttvpmexapwtdrgtoof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581518.449519-1423-171528741536562/AnsiballZ_container_config_hash.py'
Dec 01 09:31:58 compute-0 sudo[248051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:58 compute-0 python3.9[248053]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 01 09:31:58 compute-0 sudo[248051]: pam_unix(sudo:session): session closed for user root
Dec 01 09:31:59 compute-0 ceph-mon[75031]: pgmap v600: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:59 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:31:59 compute-0 sudo[248203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmbnqrjplsvudvwjknhjzfudzlzqfvrt ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764581519.2175531-1433-223683860789373/AnsiballZ_edpm_container_manage.py'
Dec 01 09:31:59 compute-0 sudo[248203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:31:59 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v601: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:31:59 compute-0 python3[248205]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Dec 01 09:32:01 compute-0 ceph-mon[75031]: pgmap v601: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:01 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v602: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:03 compute-0 ceph-mon[75031]: pgmap v602: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:03 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v603: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:04 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:32:05 compute-0 ceph-mon[75031]: pgmap v603: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:05 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v604: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:07 compute-0 ceph-mon[75031]: pgmap v604: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:07 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v605: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:09 compute-0 ceph-mon[75031]: pgmap v605: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:09 compute-0 podman[248219]: 2025-12-01 09:32:09.244430055 +0000 UTC m=+9.441464531 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 01 09:32:09 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:32:09 compute-0 podman[248317]: 2025-12-01 09:32:09.431961021 +0000 UTC m=+0.094397817 container create b54a93fad67812a69ed4ee78e6e21380d1f3fd415ea365cb91a6fd2bae05f81c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, container_name=nova_compute_init, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=edpm)
Dec 01 09:32:09 compute-0 podman[248317]: 2025-12-01 09:32:09.357970352 +0000 UTC m=+0.020407138 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 01 09:32:09 compute-0 python3[248205]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Dec 01 09:32:09 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v606: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:09 compute-0 sudo[248203]: pam_unix(sudo:session): session closed for user root
Dec 01 09:32:10 compute-0 sudo[248505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjbfqjjqlvzaplfdtwiphoojocsokevh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581529.9333081-1441-271282375652327/AnsiballZ_stat.py'
Dec 01 09:32:10 compute-0 sudo[248505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:32:10 compute-0 python3.9[248507]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:32:10 compute-0 sudo[248505]: pam_unix(sudo:session): session closed for user root
Dec 01 09:32:11 compute-0 sudo[248659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvznvsxxlrioysrcnfshupqsqjgxcopa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581530.846791-1453-163115898775501/AnsiballZ_container_config_data.py'
Dec 01 09:32:11 compute-0 sudo[248659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:32:11 compute-0 ceph-mon[75031]: pgmap v606: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:11 compute-0 python3.9[248661]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Dec 01 09:32:11 compute-0 sudo[248659]: pam_unix(sudo:session): session closed for user root
Dec 01 09:32:11 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v607: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:11 compute-0 sudo[248811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-teronjfkllbdhdhknoigjrfpoumavmux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581531.5032945-1462-255410106384571/AnsiballZ_container_config_hash.py'
Dec 01 09:32:11 compute-0 sudo[248811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:32:11 compute-0 python3.9[248813]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 01 09:32:12 compute-0 sudo[248811]: pam_unix(sudo:session): session closed for user root
Dec 01 09:32:12 compute-0 sudo[248963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzysxyfecowngicgkbjffqlndvlqctlt ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764581532.3072138-1472-118368702248312/AnsiballZ_edpm_container_manage.py'
Dec 01 09:32:12 compute-0 sudo[248963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:32:12 compute-0 python3[248965]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Dec 01 09:32:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:32:13
Dec 01 09:32:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:32:13 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:32:13 compute-0 ceph-mgr[75324]: [balancer INFO root] pools ['cephfs.cephfs.data', 'cephfs.cephfs.meta', 'volumes', 'vms', '.mgr', 'images', 'backups']
Dec 01 09:32:13 compute-0 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec 01 09:32:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:32:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:32:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:32:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:32:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:32:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:32:13 compute-0 podman[249000]: 2025-12-01 09:32:13.094683397 +0000 UTC m=+0.079029106 container create a5b64ae738e496602a7521c597d5808e11b7ca434cbff9ed3330afa8f5c806f8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute)
Dec 01 09:32:13 compute-0 podman[249000]: 2025-12-01 09:32:13.039588501 +0000 UTC m=+0.023934260 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 01 09:32:13 compute-0 python3[248965]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Dec 01 09:32:13 compute-0 ceph-mon[75031]: pgmap v607: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:32:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:32:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:32:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:32:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:32:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:32:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:32:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:32:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:32:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:32:13 compute-0 sudo[248963]: pam_unix(sudo:session): session closed for user root
Dec 01 09:32:13 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v608: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:13 compute-0 sudo[249188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krcgtsejyuhnhcdtlektwjstnsxzttoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581533.4264765-1480-200546176684060/AnsiballZ_stat.py'
Dec 01 09:32:13 compute-0 sudo[249188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:32:13 compute-0 podman[249191]: 2025-12-01 09:32:13.988406614 +0000 UTC m=+0.074125914 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:32:15 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:32:15 compute-0 ceph-mon[75031]: pgmap v608: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:15 compute-0 python3.9[249190]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:32:15 compute-0 sudo[249188]: pam_unix(sudo:session): session closed for user root
Dec 01 09:32:15 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v609: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:15 compute-0 sudo[249364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axmpsdxivubwdutehcwbliwvectycege ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581535.5552363-1489-145291728943056/AnsiballZ_file.py'
Dec 01 09:32:15 compute-0 sudo[249364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:32:15 compute-0 python3.9[249366]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:32:16 compute-0 sudo[249364]: pam_unix(sudo:session): session closed for user root
Dec 01 09:32:16 compute-0 sudo[249515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkwnkabfcufrlnqyhwhshhhjraomyyez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581536.0675564-1489-72346236261883/AnsiballZ_copy.py'
Dec 01 09:32:16 compute-0 sudo[249515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:32:16 compute-0 python3.9[249517]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764581536.0675564-1489-72346236261883/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:32:16 compute-0 sudo[249515]: pam_unix(sudo:session): session closed for user root
Dec 01 09:32:16 compute-0 sudo[249591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itbwjsjtxpqglirpylaabpzdgvfkvtai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581536.0675564-1489-72346236261883/AnsiballZ_systemd.py'
Dec 01 09:32:16 compute-0 sudo[249591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:32:17 compute-0 python3.9[249593]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 01 09:32:17 compute-0 systemd[1]: Reloading.
Dec 01 09:32:17 compute-0 ceph-mon[75031]: pgmap v609: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:17 compute-0 systemd-rc-local-generator[249621]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:32:17 compute-0 systemd-sysv-generator[249624]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:32:17 compute-0 sudo[249591]: pam_unix(sudo:session): session closed for user root
Dec 01 09:32:17 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v610: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:17 compute-0 sudo[249702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aezkrikzjnzzsslmghejmalwfpqrpujr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581536.0675564-1489-72346236261883/AnsiballZ_systemd.py'
Dec 01 09:32:17 compute-0 sudo[249702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:32:18 compute-0 python3.9[249704]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:32:18 compute-0 systemd[1]: Reloading.
Dec 01 09:32:18 compute-0 systemd-rc-local-generator[249734]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:32:18 compute-0 systemd-sysv-generator[249737]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:32:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:32:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:32:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:32:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:32:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:32:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:32:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:32:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:32:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:32:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:32:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:32:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:32:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec 01 09:32:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:32:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:32:18 compute-0 systemd[1]: Starting nova_compute container...
Dec 01 09:32:18 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:32:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3d937cf4bc601078310909448bf482559071830be6dd775607e2c1c5141cb49/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec 01 09:32:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3d937cf4bc601078310909448bf482559071830be6dd775607e2c1c5141cb49/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 01 09:32:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3d937cf4bc601078310909448bf482559071830be6dd775607e2c1c5141cb49/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 01 09:32:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3d937cf4bc601078310909448bf482559071830be6dd775607e2c1c5141cb49/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 01 09:32:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3d937cf4bc601078310909448bf482559071830be6dd775607e2c1c5141cb49/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 01 09:32:18 compute-0 podman[249744]: 2025-12-01 09:32:18.716856926 +0000 UTC m=+0.125684778 container init a5b64ae738e496602a7521c597d5808e11b7ca434cbff9ed3330afa8f5c806f8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Dec 01 09:32:18 compute-0 podman[249744]: 2025-12-01 09:32:18.726092751 +0000 UTC m=+0.134920563 container start a5b64ae738e496602a7521c597d5808e11b7ca434cbff9ed3330afa8f5c806f8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 01 09:32:18 compute-0 podman[249744]: nova_compute
Dec 01 09:32:18 compute-0 nova_compute[249760]: + sudo -E kolla_set_configs
Dec 01 09:32:18 compute-0 systemd[1]: Started nova_compute container.
Dec 01 09:32:18 compute-0 sudo[249702]: pam_unix(sudo:session): session closed for user root
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Validating config file
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Copying service configuration files
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Deleting /etc/ceph
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Creating directory /etc/ceph
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Setting permission for /etc/ceph
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Writing out command to execute
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 01 09:32:18 compute-0 nova_compute[249760]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 01 09:32:18 compute-0 nova_compute[249760]: ++ cat /run_command
Dec 01 09:32:18 compute-0 nova_compute[249760]: + CMD=nova-compute
Dec 01 09:32:18 compute-0 nova_compute[249760]: + ARGS=
Dec 01 09:32:18 compute-0 nova_compute[249760]: + sudo kolla_copy_cacerts
Dec 01 09:32:18 compute-0 nova_compute[249760]: + [[ ! -n '' ]]
Dec 01 09:32:18 compute-0 nova_compute[249760]: + . kolla_extend_start
Dec 01 09:32:18 compute-0 nova_compute[249760]: + echo 'Running command: '\''nova-compute'\'''
Dec 01 09:32:18 compute-0 nova_compute[249760]: Running command: 'nova-compute'
Dec 01 09:32:18 compute-0 nova_compute[249760]: + umask 0022
Dec 01 09:32:18 compute-0 nova_compute[249760]: + exec nova-compute
Dec 01 09:32:19 compute-0 ceph-mon[75031]: pgmap v610: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:19 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v611: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:19 compute-0 python3.9[249921]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:32:20 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:32:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:32:20.465 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:32:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:32:20.466 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:32:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:32:20.467 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:32:20 compute-0 python3.9[250072]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:32:21 compute-0 nova_compute[249760]: 2025-12-01 09:32:21.137 249764 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 01 09:32:21 compute-0 nova_compute[249760]: 2025-12-01 09:32:21.138 249764 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 01 09:32:21 compute-0 nova_compute[249760]: 2025-12-01 09:32:21.138 249764 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 01 09:32:21 compute-0 nova_compute[249760]: 2025-12-01 09:32:21.138 249764 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Dec 01 09:32:21 compute-0 ceph-mon[75031]: pgmap v611: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:21 compute-0 nova_compute[249760]: 2025-12-01 09:32:21.288 249764 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:32:21 compute-0 python3.9[250224]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:32:21 compute-0 nova_compute[249760]: 2025-12-01 09:32:21.314 249764 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:32:21 compute-0 nova_compute[249760]: 2025-12-01 09:32:21.314 249764 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Dec 01 09:32:21 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v612: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:21 compute-0 nova_compute[249760]: 2025-12-01 09:32:21.990 249764 INFO nova.virt.driver [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.130 249764 INFO nova.compute.provider_config [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.146 249764 DEBUG oslo_concurrency.lockutils [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.146 249764 DEBUG oslo_concurrency.lockutils [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.147 249764 DEBUG oslo_concurrency.lockutils [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.147 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.148 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.148 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.148 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.148 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.148 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.149 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.149 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.149 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.149 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.150 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.150 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.150 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.150 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.151 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.151 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.151 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.151 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.151 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.152 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.152 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.152 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.152 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.153 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.153 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.153 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.153 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.153 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.154 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.154 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.154 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.154 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.155 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.155 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.155 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.155 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.155 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.156 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.156 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.156 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.156 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.157 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.157 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.157 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.157 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.157 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.158 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.158 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.158 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.158 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.158 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.158 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.159 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.159 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.159 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.159 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.159 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.159 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.160 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.160 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.160 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.160 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.160 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.160 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.160 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.161 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.161 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.161 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.161 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.161 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.161 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.161 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.162 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.162 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.162 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.162 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.162 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.162 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.163 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.163 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.163 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.163 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.163 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.163 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.163 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.164 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.164 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.164 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.164 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.164 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.164 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.165 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.165 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.165 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.165 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.165 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.165 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.165 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.166 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.166 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.166 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.166 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.166 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.166 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.166 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.167 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.167 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.167 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.167 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.167 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.167 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.167 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.168 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.168 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.168 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.168 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.168 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.168 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.168 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.169 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.169 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.169 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.169 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.169 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.169 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.169 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.170 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.170 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.170 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.170 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.170 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.170 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.171 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.171 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.171 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.171 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.171 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.171 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.171 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.172 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.172 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.172 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.172 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.172 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.172 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.173 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.173 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.173 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.173 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.173 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.173 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.173 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.174 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.174 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.174 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.174 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.174 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.174 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.175 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.175 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.175 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.175 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.175 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.175 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.175 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.176 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.176 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.176 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.176 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.176 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.176 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.177 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.177 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.177 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.177 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.177 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.177 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.177 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.178 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.178 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.178 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.178 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.178 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.178 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.179 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.179 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.179 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.179 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.179 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.179 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.179 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.180 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.180 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.180 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.180 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.180 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.180 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.180 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.181 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.181 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.181 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.181 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.181 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.181 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.181 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.182 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.182 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.182 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.182 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.182 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.182 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.183 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.183 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.183 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.183 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.183 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.183 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.183 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.184 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.184 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.184 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.184 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.184 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.184 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.184 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.185 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.185 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.185 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.185 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.185 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.185 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.185 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.186 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.186 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.186 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.186 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.186 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.186 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.187 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.187 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.187 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.187 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.187 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.187 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.187 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.188 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.188 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.188 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.188 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.188 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.188 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.189 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.189 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.189 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.189 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.189 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.189 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.189 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.190 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.190 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.190 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.190 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.190 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.190 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.190 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.191 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.191 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.191 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.191 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.191 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.191 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.192 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.192 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.192 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.192 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.192 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.192 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.192 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.193 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.193 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.193 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.193 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.193 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.193 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.193 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.194 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.194 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.194 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.194 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.194 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.194 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.194 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.195 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.195 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.195 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.195 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.195 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.195 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.195 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.196 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.196 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.196 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.196 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.196 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.196 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.197 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.197 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.197 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.197 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.197 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.197 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.197 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.198 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.198 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.198 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.198 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.198 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.198 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.199 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.199 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.199 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.199 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.199 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.199 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.199 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.200 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.200 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.200 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.200 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.200 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.200 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.200 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.201 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.201 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.201 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.201 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.201 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.201 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.202 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.202 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.202 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.202 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.202 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.202 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.202 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.203 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.203 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.203 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.203 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.203 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.203 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.203 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.204 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.204 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.204 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.204 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.204 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.205 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.205 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.205 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.205 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.205 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.205 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.206 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.206 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.206 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.206 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.206 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.206 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.206 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.207 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.207 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.207 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.207 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.207 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.207 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.207 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.208 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.208 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.208 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.208 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.209 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.209 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.209 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.209 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.209 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.209 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.210 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.210 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.210 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.210 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.210 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.211 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.211 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.211 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.211 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.211 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.211 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.211 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.212 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.212 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.212 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.212 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.212 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.212 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.213 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.213 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.213 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.213 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.213 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.213 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.213 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.214 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.214 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.214 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.214 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.214 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.214 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.215 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.215 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.215 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.215 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.215 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.215 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.215 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.216 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.216 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.216 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.216 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.216 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.216 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.216 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.217 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.217 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.217 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.217 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.217 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.217 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.217 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.218 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.218 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.218 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.218 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.218 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.218 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.219 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.219 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.219 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.219 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.219 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.219 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.220 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.220 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.220 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.220 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.220 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.220 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.221 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.221 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.221 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.221 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.221 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.221 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.221 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.222 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.222 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.222 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.222 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.222 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.222 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.222 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.223 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.223 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.223 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.223 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.223 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.223 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.223 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.224 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.224 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.224 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.224 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.224 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.224 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.225 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.225 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.225 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.225 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.225 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.225 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.226 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.226 249764 WARNING oslo_config.cfg [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec 01 09:32:22 compute-0 nova_compute[249760]: live_migration_uri is deprecated for removal in favor of two other options that
Dec 01 09:32:22 compute-0 nova_compute[249760]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec 01 09:32:22 compute-0 nova_compute[249760]: and ``live_migration_inbound_addr`` respectively.
Dec 01 09:32:22 compute-0 nova_compute[249760]: ).  Its value may be silently ignored in the future.
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.226 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.226 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.226 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.227 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.227 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.227 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.227 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.227 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.227 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.227 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.228 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.228 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.228 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.228 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.228 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.228 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.229 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.229 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.229 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.rbd_secret_uuid        = 5620a9fb-e540-5250-a0e8-7aaad5347e3b log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.229 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.229 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.229 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.229 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.230 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.230 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.230 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.230 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.230 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.230 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.231 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.231 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.231 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.231 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.231 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.231 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.232 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.232 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.232 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.232 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.232 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.232 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.232 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.233 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.233 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.233 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.233 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.233 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.233 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.234 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.234 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 sudo[250376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oprsdsnuremxkqomjynkmoitujjhyurd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581541.5828881-1549-30799919111923/AnsiballZ_podman_container.py'
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.234 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.234 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.234 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.234 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.234 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.235 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.235 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.235 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.235 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.235 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.235 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.236 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.236 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.236 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.236 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.236 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.236 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.236 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.237 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.237 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.237 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.237 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.237 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.237 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.237 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.238 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.238 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.238 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.238 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.238 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.238 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.238 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 sudo[250376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.239 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.239 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.239 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.239 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.239 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.239 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.240 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.240 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.240 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.240 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.240 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.240 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.240 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.241 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.241 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.241 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.241 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.241 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.241 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.241 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.242 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.242 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.242 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.242 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.242 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.242 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.243 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.243 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.243 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.243 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.243 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.243 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.243 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.244 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.244 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.244 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.244 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.244 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.244 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.244 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.245 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.245 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.245 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.245 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.245 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.245 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.245 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.246 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.246 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.246 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.246 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.246 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.246 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.247 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.247 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.247 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.247 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.247 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.247 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.248 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.248 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.248 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.248 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.248 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.248 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.249 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.249 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.249 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.249 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.249 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.249 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.249 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.250 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.250 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.250 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.250 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.250 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.250 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.250 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.251 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.251 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.251 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.251 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.251 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.251 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.252 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.252 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.252 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.252 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.252 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.252 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.252 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.253 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.253 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.253 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.253 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.253 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.253 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.254 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.254 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.254 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.254 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.254 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.254 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.255 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.255 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.255 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.255 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.255 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.255 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.255 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.256 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.256 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.256 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.256 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.256 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.256 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.257 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.257 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.257 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.257 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.257 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.257 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.258 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.258 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.258 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.258 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.258 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.258 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.258 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.259 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.259 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.259 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.259 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.259 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.259 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.259 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.260 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.260 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.260 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.260 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.260 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.260 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.260 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.261 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.261 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.261 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.261 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.261 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.261 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.262 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.262 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.262 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.262 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.262 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.262 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.263 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.263 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.263 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.263 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.263 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.263 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.263 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.264 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.264 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.264 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.264 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.264 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.265 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.265 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.265 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.265 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.265 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.265 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.266 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.266 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.266 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.266 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.266 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.266 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.266 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.267 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.267 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.267 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.267 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.267 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.267 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.267 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.268 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.268 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.268 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.268 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.268 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.268 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.268 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.269 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.269 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.269 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.269 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.269 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.269 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.270 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.270 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.270 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.270 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.270 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.270 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.270 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.271 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.271 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.271 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.271 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.271 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.271 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.272 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.272 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.272 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.272 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.272 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.272 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.272 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.273 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.273 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.273 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.273 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.273 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.273 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.274 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.274 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.274 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.274 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.274 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.274 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.274 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.275 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.275 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.275 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.275 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.275 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.275 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.275 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.276 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.276 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.276 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.276 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.276 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.276 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.276 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.277 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.277 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.277 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.277 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.277 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.277 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.278 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.278 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.278 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.278 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.278 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.278 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.278 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.279 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.279 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.279 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.279 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.279 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.279 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.280 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.280 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.280 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.280 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.280 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.280 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.280 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.281 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.281 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.281 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.281 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.281 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.281 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.281 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.282 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.282 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.282 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.282 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.282 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.282 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.282 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.283 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.283 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.283 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.283 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.283 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.283 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.283 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.284 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.284 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.284 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.284 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.284 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.284 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.285 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.285 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.285 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.285 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.285 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.285 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.285 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.286 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.286 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.286 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.286 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.286 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.286 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.286 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.287 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.287 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.287 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.287 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.287 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.287 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.288 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.288 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.288 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.288 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.288 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.288 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.288 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.289 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.289 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.289 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.289 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.289 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.289 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.289 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.290 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.290 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.290 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.290 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.290 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.290 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.290 249764 DEBUG oslo_service.service [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.292 249764 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.312 249764 DEBUG nova.virt.libvirt.host [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.313 249764 DEBUG nova.virt.libvirt.host [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.313 249764 DEBUG nova.virt.libvirt.host [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.313 249764 DEBUG nova.virt.libvirt.host [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Dec 01 09:32:22 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Dec 01 09:32:22 compute-0 systemd[1]: Started libvirt QEMU daemon.
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.402 249764 DEBUG nova.virt.libvirt.host [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f4d0fde95b0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.406 249764 DEBUG nova.virt.libvirt.host [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f4d0fde95b0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.407 249764 INFO nova.virt.libvirt.driver [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Connection event '1' reason 'None'
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.433 249764 WARNING nova.virt.libvirt.driver [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Dec 01 09:32:22 compute-0 nova_compute[249760]: 2025-12-01 09:32:22.434 249764 DEBUG nova.virt.libvirt.volume.mount [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Dec 01 09:32:22 compute-0 python3.9[250378]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 01 09:32:22 compute-0 sudo[250376]: pam_unix(sudo:session): session closed for user root
Dec 01 09:32:22 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 01 09:32:22 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:32:22 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Cumulative writes: 3004 writes, 12K keys, 3004 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.01 MB/s
                                           Cumulative WAL: 3004 writes, 3004 syncs, 1.00 writes per sync, written: 0.01 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1278 writes, 5303 keys, 1278 commit groups, 1.0 writes per commit group, ingest: 5.67 MB, 0.01 MB/s
                                           Interval WAL: 1278 writes, 1278 syncs, 1.00 writes per sync, written: 0.01 GB, 0.01 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    109.6      0.09              0.04         6    0.015       0      0       0.0       0.0
                                             L6      1/0    4.44 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   2.4    150.7    122.6      0.19              0.09         5    0.038     16K   2271       0.0       0.0
                                            Sum      1/0    4.44 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   3.4    102.7    118.5      0.28              0.12        11    0.026     16K   2271       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   4.4    130.7    133.8      0.14              0.05         6    0.023     10K   1498       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0    150.7    122.6      0.19              0.09         5    0.038     16K   2271       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    112.2      0.09              0.04         5    0.017       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     19.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.010, interval 0.004
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.03 GB write, 0.03 MB/s write, 0.03 GB read, 0.02 MB/s read, 0.3 seconds
                                           Interval compaction: 0.02 GB write, 0.03 MB/s write, 0.02 GB read, 0.03 MB/s read, 0.1 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55bbd56b51f0#2 capacity: 308.00 MB usage: 1.30 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 6.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(81,1.15 MB,0.374767%) FilterBlock(12,52.61 KB,0.0166806%) IndexBlock(12,99.75 KB,0.0316273%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Dec 01 09:32:23 compute-0 sudo[250630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uysbbdjahvnblpgzdpkabdlxdopirfvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581542.8319564-1557-275115873184033/AnsiballZ_systemd.py'
Dec 01 09:32:23 compute-0 sudo[250630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:32:23 compute-0 podman[250566]: 2025-12-01 09:32:23.153174682 +0000 UTC m=+0.094428538 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 01 09:32:23 compute-0 ceph-mon[75031]: pgmap v612: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:23 compute-0 nova_compute[249760]: 2025-12-01 09:32:23.357 249764 INFO nova.virt.libvirt.host [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Libvirt host capabilities <capabilities>
Dec 01 09:32:23 compute-0 nova_compute[249760]: 
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <host>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <uuid>52310927-1d30-4bda-9d2b-fd9f7cfadc4d</uuid>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <cpu>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <arch>x86_64</arch>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model>EPYC-Rome-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <vendor>AMD</vendor>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <microcode version='16777317'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <signature family='23' model='49' stepping='0'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <maxphysaddr mode='emulate' bits='40'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature name='x2apic'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature name='tsc-deadline'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature name='osxsave'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature name='hypervisor'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature name='tsc_adjust'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature name='spec-ctrl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature name='stibp'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature name='arch-capabilities'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature name='ssbd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature name='cmp_legacy'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature name='topoext'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature name='virt-ssbd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature name='lbrv'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature name='tsc-scale'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature name='vmcb-clean'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature name='pause-filter'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature name='pfthreshold'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature name='svme-addr-chk'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature name='rdctl-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature name='skip-l1dfl-vmentry'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature name='mds-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature name='pschange-mc-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <pages unit='KiB' size='4'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <pages unit='KiB' size='2048'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <pages unit='KiB' size='1048576'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </cpu>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <power_management>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <suspend_mem/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </power_management>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <iommu support='no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <migration_features>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <live/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <uri_transports>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <uri_transport>tcp</uri_transport>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <uri_transport>rdma</uri_transport>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </uri_transports>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </migration_features>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <topology>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <cells num='1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <cell id='0'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:           <memory unit='KiB'>7864320</memory>
Dec 01 09:32:23 compute-0 nova_compute[249760]:           <pages unit='KiB' size='4'>1966080</pages>
Dec 01 09:32:23 compute-0 nova_compute[249760]:           <pages unit='KiB' size='2048'>0</pages>
Dec 01 09:32:23 compute-0 nova_compute[249760]:           <pages unit='KiB' size='1048576'>0</pages>
Dec 01 09:32:23 compute-0 nova_compute[249760]:           <distances>
Dec 01 09:32:23 compute-0 nova_compute[249760]:             <sibling id='0' value='10'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:           </distances>
Dec 01 09:32:23 compute-0 nova_compute[249760]:           <cpus num='8'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:           </cpus>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         </cell>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </cells>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </topology>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <cache>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </cache>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <secmodel>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model>selinux</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <doi>0</doi>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </secmodel>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <secmodel>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model>dac</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <doi>0</doi>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <baselabel type='kvm'>+107:+107</baselabel>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <baselabel type='qemu'>+107:+107</baselabel>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </secmodel>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   </host>
Dec 01 09:32:23 compute-0 nova_compute[249760]: 
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <guest>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <os_type>hvm</os_type>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <arch name='i686'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <wordsize>32</wordsize>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <domain type='qemu'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <domain type='kvm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </arch>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <features>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <pae/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <nonpae/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <acpi default='on' toggle='yes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <apic default='on' toggle='no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <cpuselection/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <deviceboot/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <disksnapshot default='on' toggle='no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <externalSnapshot/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </features>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   </guest>
Dec 01 09:32:23 compute-0 nova_compute[249760]: 
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <guest>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <os_type>hvm</os_type>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <arch name='x86_64'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <wordsize>64</wordsize>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <domain type='qemu'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <domain type='kvm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </arch>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <features>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <acpi default='on' toggle='yes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <apic default='on' toggle='no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <cpuselection/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <deviceboot/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <disksnapshot default='on' toggle='no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <externalSnapshot/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </features>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   </guest>
Dec 01 09:32:23 compute-0 nova_compute[249760]: 
Dec 01 09:32:23 compute-0 nova_compute[249760]: </capabilities>
Dec 01 09:32:23 compute-0 nova_compute[249760]: 
Dec 01 09:32:23 compute-0 nova_compute[249760]: 2025-12-01 09:32:23.365 249764 DEBUG nova.virt.libvirt.host [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 01 09:32:23 compute-0 nova_compute[249760]: 2025-12-01 09:32:23.389 249764 DEBUG nova.virt.libvirt.host [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec 01 09:32:23 compute-0 nova_compute[249760]: <domainCapabilities>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <path>/usr/libexec/qemu-kvm</path>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <domain>kvm</domain>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <arch>i686</arch>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <vcpu max='4096'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <iothreads supported='yes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <os supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <enum name='firmware'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <loader supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='type'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>rom</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>pflash</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='readonly'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>yes</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>no</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='secure'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>no</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </loader>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   </os>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <cpu>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <mode name='host-passthrough' supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='hostPassthroughMigratable'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>on</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>off</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </mode>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <mode name='maximum' supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='maximumMigratable'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>on</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>off</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </mode>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <mode name='host-model' supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <vendor>AMD</vendor>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='x2apic'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='tsc-deadline'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='hypervisor'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='tsc_adjust'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='spec-ctrl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='stibp'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='ssbd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='cmp_legacy'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='overflow-recov'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='succor'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='ibrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='amd-ssbd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='virt-ssbd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='lbrv'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='tsc-scale'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='vmcb-clean'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='flushbyasid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='pause-filter'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='pfthreshold'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='svme-addr-chk'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='disable' name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </mode>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <mode name='custom' supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Broadwell'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Broadwell-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Broadwell-noTSX'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Broadwell-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Broadwell-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Broadwell-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Broadwell-v4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cascadelake-Server'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cascadelake-Server-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cascadelake-Server-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cascadelake-Server-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cascadelake-Server-v4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cascadelake-Server-v5'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cooperlake'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cooperlake-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cooperlake-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Denverton'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mpx'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Denverton-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mpx'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Denverton-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Denverton-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Dhyana-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Genoa'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amd-psfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='auto-ibrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='no-nested-data-bp'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='null-sel-clr-base'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='stibp-always-on'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Genoa-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amd-psfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='auto-ibrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='no-nested-data-bp'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='null-sel-clr-base'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='stibp-always-on'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Milan'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Milan-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Milan-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amd-psfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='no-nested-data-bp'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='null-sel-clr-base'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='stibp-always-on'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Rome'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Rome-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Rome-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Rome-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-v4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='GraniteRapids'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-tile'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fbsdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrc'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fzrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mcdt-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pbrsb-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='prefetchiti'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='psdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='GraniteRapids-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-tile'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fbsdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrc'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fzrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mcdt-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pbrsb-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='prefetchiti'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='psdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='GraniteRapids-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-tile'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx10'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx10-128'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx10-256'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx10-512'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cldemote'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fbsdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrc'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fzrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mcdt-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdir64b'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdiri'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pbrsb-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='prefetchiti'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='psdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Haswell'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Haswell-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Haswell-noTSX'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Haswell-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Haswell-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Haswell-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Haswell-v4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server-noTSX'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server-v4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server-v5'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server-v6'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server-v7'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='IvyBridge'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='IvyBridge-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='IvyBridge-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='IvyBridge-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='KnightsMill'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-4fmaps'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-4vnniw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512er'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512pf'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='KnightsMill-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-4fmaps'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-4vnniw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512er'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512pf'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 01 09:32:23 compute-0 python3.9[250637]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Opteron_G4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fma4'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xop'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Opteron_G4-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fma4'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xop'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Opteron_G5'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fma4'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tbm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xop'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Opteron_G5-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fma4'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tbm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xop'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='SapphireRapids'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-tile'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrc'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fzrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='SapphireRapids-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-tile'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrc'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fzrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='SapphireRapids-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-tile'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fbsdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrc'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fzrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='psdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='SapphireRapids-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-tile'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cldemote'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fbsdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrc'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fzrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdir64b'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdiri'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='psdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='SierraForest'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-ne-convert'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cmpccxadd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fbsdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mcdt-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pbrsb-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='psdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='SierraForest-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-ne-convert'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cmpccxadd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fbsdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mcdt-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pbrsb-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='psdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Client'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Client-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Client-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Client-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Client-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Client-v4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Server'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Server-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Server-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Server-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Server-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Server-v4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Server-v5'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Snowridge'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cldemote'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='core-capability'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdir64b'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdiri'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mpx'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='split-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Snowridge-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cldemote'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='core-capability'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdir64b'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdiri'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mpx'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='split-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Snowridge-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cldemote'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='core-capability'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdir64b'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdiri'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='split-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Snowridge-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cldemote'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='core-capability'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdir64b'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdiri'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='split-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Snowridge-v4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cldemote'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdir64b'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdiri'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='athlon'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='3dnow'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='3dnowext'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='athlon-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='3dnow'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='3dnowext'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='core2duo'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='core2duo-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='coreduo'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='coreduo-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='n270'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='n270-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='phenom'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='3dnow'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='3dnowext'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='phenom-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='3dnow'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='3dnowext'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </mode>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   </cpu>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <memoryBacking supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <enum name='sourceType'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <value>file</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <value>anonymous</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <value>memfd</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   </memoryBacking>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <devices>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <disk supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='diskDevice'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>disk</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>cdrom</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>floppy</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>lun</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='bus'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>fdc</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>scsi</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtio</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>usb</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>sata</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='model'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtio</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtio-transitional</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtio-non-transitional</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </disk>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <graphics supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='type'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>vnc</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>egl-headless</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>dbus</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </graphics>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <video supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='modelType'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>vga</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>cirrus</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtio</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>none</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>bochs</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>ramfb</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </video>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <hostdev supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='mode'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>subsystem</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='startupPolicy'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>default</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>mandatory</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>requisite</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>optional</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='subsysType'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>usb</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>pci</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>scsi</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='capsType'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='pciBackend'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </hostdev>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <rng supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='model'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtio</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtio-transitional</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtio-non-transitional</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='backendModel'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>random</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>egd</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>builtin</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </rng>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <filesystem supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='driverType'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>path</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>handle</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtiofs</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </filesystem>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <tpm supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='model'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>tpm-tis</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>tpm-crb</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='backendModel'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>emulator</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>external</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='backendVersion'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>2.0</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </tpm>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <redirdev supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='bus'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>usb</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </redirdev>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <channel supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='type'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>pty</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>unix</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </channel>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <crypto supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='model'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='type'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>qemu</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='backendModel'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>builtin</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </crypto>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <interface supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='backendType'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>default</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>passt</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </interface>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <panic supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='model'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>isa</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>hyperv</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </panic>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <console supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='type'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>null</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>vc</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>pty</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>dev</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>file</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>pipe</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>stdio</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>udp</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>tcp</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>unix</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>qemu-vdagent</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>dbus</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </console>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   </devices>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <features>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <gic supported='no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <vmcoreinfo supported='yes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <genid supported='yes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <backingStoreInput supported='yes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <backup supported='yes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <async-teardown supported='yes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <ps2 supported='yes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <sev supported='no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <sgx supported='no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <hyperv supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='features'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>relaxed</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>vapic</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>spinlocks</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>vpindex</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>runtime</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>synic</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>stimer</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>reset</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>vendor_id</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>frequencies</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>reenlightenment</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>tlbflush</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>ipi</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>avic</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>emsr_bitmap</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>xmm_input</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <defaults>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <spinlocks>4095</spinlocks>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <stimer_direct>on</stimer_direct>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <tlbflush_direct>on</tlbflush_direct>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <tlbflush_extended>on</tlbflush_extended>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </defaults>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </hyperv>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <launchSecurity supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='sectype'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>tdx</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </launchSecurity>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   </features>
Dec 01 09:32:23 compute-0 nova_compute[249760]: </domainCapabilities>
Dec 01 09:32:23 compute-0 nova_compute[249760]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 01 09:32:23 compute-0 nova_compute[249760]: 2025-12-01 09:32:23.398 249764 DEBUG nova.virt.libvirt.host [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 01 09:32:23 compute-0 nova_compute[249760]: <domainCapabilities>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <path>/usr/libexec/qemu-kvm</path>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <domain>kvm</domain>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <arch>i686</arch>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <vcpu max='240'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <iothreads supported='yes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <os supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <enum name='firmware'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <loader supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='type'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>rom</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>pflash</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='readonly'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>yes</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>no</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='secure'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>no</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </loader>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   </os>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <cpu>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <mode name='host-passthrough' supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='hostPassthroughMigratable'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>on</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>off</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </mode>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <mode name='maximum' supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='maximumMigratable'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>on</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>off</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </mode>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <mode name='host-model' supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <vendor>AMD</vendor>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='x2apic'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='tsc-deadline'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='hypervisor'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='tsc_adjust'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='spec-ctrl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='stibp'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='ssbd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='cmp_legacy'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='overflow-recov'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='succor'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='ibrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='amd-ssbd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='virt-ssbd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='lbrv'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='tsc-scale'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='vmcb-clean'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='flushbyasid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='pause-filter'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='pfthreshold'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='svme-addr-chk'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='disable' name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </mode>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <mode name='custom' supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Broadwell'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Broadwell-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Broadwell-noTSX'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Broadwell-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Broadwell-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Broadwell-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Broadwell-v4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cascadelake-Server'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cascadelake-Server-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cascadelake-Server-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cascadelake-Server-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cascadelake-Server-v4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cascadelake-Server-v5'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cooperlake'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cooperlake-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cooperlake-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 systemd[1]: Stopping nova_compute container...
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Denverton'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mpx'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Denverton-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mpx'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Denverton-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Denverton-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Dhyana-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Genoa'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amd-psfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='auto-ibrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='no-nested-data-bp'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='null-sel-clr-base'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='stibp-always-on'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Genoa-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amd-psfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='auto-ibrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='no-nested-data-bp'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='null-sel-clr-base'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='stibp-always-on'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Milan'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Milan-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Milan-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amd-psfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='no-nested-data-bp'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='null-sel-clr-base'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='stibp-always-on'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Rome'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Rome-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Rome-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Rome-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-v4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='GraniteRapids'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-tile'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fbsdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrc'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fzrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mcdt-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pbrsb-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='prefetchiti'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='psdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='GraniteRapids-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-tile'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fbsdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrc'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fzrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mcdt-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pbrsb-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='prefetchiti'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='psdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='GraniteRapids-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-tile'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx10'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx10-128'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx10-256'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx10-512'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cldemote'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fbsdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrc'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fzrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mcdt-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdir64b'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdiri'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pbrsb-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='prefetchiti'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='psdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Haswell'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Haswell-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Haswell-noTSX'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Haswell-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Haswell-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Haswell-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Haswell-v4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server-noTSX'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server-v4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server-v5'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server-v6'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server-v7'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='IvyBridge'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='IvyBridge-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='IvyBridge-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='IvyBridge-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='KnightsMill'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-4fmaps'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-4vnniw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512er'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512pf'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='KnightsMill-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-4fmaps'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-4vnniw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512er'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512pf'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Opteron_G4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fma4'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xop'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Opteron_G4-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fma4'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xop'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Opteron_G5'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fma4'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tbm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xop'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Opteron_G5-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fma4'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tbm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xop'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='SapphireRapids'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-tile'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrc'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fzrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='SapphireRapids-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-tile'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrc'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fzrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='SapphireRapids-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-tile'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fbsdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrc'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fzrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='psdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='SapphireRapids-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-tile'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cldemote'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fbsdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrc'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fzrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdir64b'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdiri'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='psdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='SierraForest'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-ne-convert'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cmpccxadd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fbsdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mcdt-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pbrsb-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='psdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='SierraForest-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-ne-convert'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cmpccxadd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fbsdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mcdt-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pbrsb-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='psdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Client'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Client-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Client-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Client-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Client-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Client-v4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Server'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Server-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Server-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Server-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Server-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Server-v4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Server-v5'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Snowridge'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cldemote'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='core-capability'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdir64b'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdiri'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mpx'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='split-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Snowridge-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cldemote'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='core-capability'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdir64b'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdiri'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mpx'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='split-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Snowridge-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cldemote'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='core-capability'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdir64b'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdiri'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='split-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Snowridge-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cldemote'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='core-capability'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdir64b'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdiri'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='split-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Snowridge-v4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cldemote'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdir64b'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdiri'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='athlon'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='3dnow'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='3dnowext'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='athlon-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='3dnow'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='3dnowext'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='core2duo'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='core2duo-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='coreduo'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='coreduo-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='n270'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='n270-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='phenom'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='3dnow'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='3dnowext'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='phenom-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='3dnow'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='3dnowext'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </mode>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   </cpu>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <memoryBacking supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <enum name='sourceType'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <value>file</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <value>anonymous</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <value>memfd</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   </memoryBacking>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <devices>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <disk supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='diskDevice'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>disk</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>cdrom</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>floppy</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>lun</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='bus'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>ide</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>fdc</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>scsi</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtio</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>usb</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>sata</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='model'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtio</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtio-transitional</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtio-non-transitional</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </disk>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <graphics supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='type'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>vnc</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>egl-headless</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>dbus</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </graphics>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <video supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='modelType'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>vga</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>cirrus</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtio</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>none</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>bochs</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>ramfb</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </video>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <hostdev supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='mode'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>subsystem</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='startupPolicy'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>default</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>mandatory</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>requisite</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>optional</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='subsysType'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>usb</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>pci</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>scsi</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='capsType'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='pciBackend'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </hostdev>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <rng supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='model'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtio</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtio-transitional</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtio-non-transitional</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='backendModel'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>random</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>egd</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>builtin</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </rng>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <filesystem supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='driverType'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>path</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>handle</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtiofs</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </filesystem>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <tpm supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='model'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>tpm-tis</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>tpm-crb</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='backendModel'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>emulator</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>external</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='backendVersion'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>2.0</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </tpm>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <redirdev supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='bus'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>usb</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </redirdev>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <channel supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='type'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>pty</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>unix</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </channel>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <crypto supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='model'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='type'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>qemu</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='backendModel'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>builtin</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </crypto>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <interface supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='backendType'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>default</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>passt</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </interface>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <panic supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='model'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>isa</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>hyperv</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </panic>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <console supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='type'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>null</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>vc</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>pty</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>dev</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>file</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>pipe</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>stdio</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>udp</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>tcp</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>unix</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>qemu-vdagent</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>dbus</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </console>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   </devices>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <features>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <gic supported='no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <vmcoreinfo supported='yes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <genid supported='yes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <backingStoreInput supported='yes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <backup supported='yes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <async-teardown supported='yes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <ps2 supported='yes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <sev supported='no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <sgx supported='no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <hyperv supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='features'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>relaxed</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>vapic</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>spinlocks</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>vpindex</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>runtime</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>synic</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>stimer</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>reset</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>vendor_id</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>frequencies</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>reenlightenment</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>tlbflush</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>ipi</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>avic</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>emsr_bitmap</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>xmm_input</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <defaults>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <spinlocks>4095</spinlocks>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <stimer_direct>on</stimer_direct>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <tlbflush_direct>on</tlbflush_direct>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <tlbflush_extended>on</tlbflush_extended>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </defaults>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </hyperv>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <launchSecurity supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='sectype'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>tdx</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </launchSecurity>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   </features>
Dec 01 09:32:23 compute-0 nova_compute[249760]: </domainCapabilities>
Dec 01 09:32:23 compute-0 nova_compute[249760]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 01 09:32:23 compute-0 nova_compute[249760]: 2025-12-01 09:32:23.434 249764 DEBUG nova.virt.libvirt.host [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 01 09:32:23 compute-0 nova_compute[249760]: 2025-12-01 09:32:23.438 249764 DEBUG nova.virt.libvirt.host [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec 01 09:32:23 compute-0 nova_compute[249760]: <domainCapabilities>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <path>/usr/libexec/qemu-kvm</path>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <domain>kvm</domain>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <arch>x86_64</arch>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <vcpu max='4096'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <iothreads supported='yes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <os supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <enum name='firmware'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <value>efi</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <loader supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='type'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>rom</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>pflash</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='readonly'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>yes</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>no</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='secure'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>yes</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>no</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </loader>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   </os>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <cpu>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <mode name='host-passthrough' supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='hostPassthroughMigratable'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>on</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>off</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </mode>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <mode name='maximum' supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='maximumMigratable'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>on</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>off</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </mode>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <mode name='host-model' supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <vendor>AMD</vendor>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='x2apic'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='tsc-deadline'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='hypervisor'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='tsc_adjust'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='spec-ctrl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='stibp'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='ssbd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='cmp_legacy'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='overflow-recov'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='succor'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='ibrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='amd-ssbd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='virt-ssbd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='lbrv'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='tsc-scale'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='vmcb-clean'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='flushbyasid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='pause-filter'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='pfthreshold'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='svme-addr-chk'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='disable' name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </mode>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <mode name='custom' supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Broadwell'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Broadwell-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Broadwell-noTSX'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Broadwell-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Broadwell-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Broadwell-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Broadwell-v4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cascadelake-Server'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cascadelake-Server-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cascadelake-Server-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cascadelake-Server-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cascadelake-Server-v4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cascadelake-Server-v5'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cooperlake'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cooperlake-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cooperlake-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Denverton'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mpx'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Denverton-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mpx'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Denverton-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Denverton-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Dhyana-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Genoa'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amd-psfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='auto-ibrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='no-nested-data-bp'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='null-sel-clr-base'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='stibp-always-on'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Genoa-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amd-psfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='auto-ibrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='no-nested-data-bp'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='null-sel-clr-base'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='stibp-always-on'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Milan'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Milan-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Milan-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amd-psfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='no-nested-data-bp'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='null-sel-clr-base'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='stibp-always-on'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Rome'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Rome-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Rome-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Rome-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-v4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='GraniteRapids'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-tile'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fbsdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrc'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fzrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mcdt-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pbrsb-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='prefetchiti'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='psdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='GraniteRapids-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-tile'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fbsdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrc'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fzrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mcdt-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pbrsb-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='prefetchiti'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='psdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='GraniteRapids-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-tile'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx10'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx10-128'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx10-256'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx10-512'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cldemote'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fbsdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrc'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fzrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mcdt-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdir64b'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdiri'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pbrsb-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='prefetchiti'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='psdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Haswell'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Haswell-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Haswell-noTSX'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Haswell-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Haswell-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Haswell-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Haswell-v4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server-noTSX'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server-v4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server-v5'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server-v6'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server-v7'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='IvyBridge'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='IvyBridge-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='IvyBridge-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='IvyBridge-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='KnightsMill'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-4fmaps'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-4vnniw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512er'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512pf'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='KnightsMill-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-4fmaps'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-4vnniw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512er'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512pf'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Opteron_G4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fma4'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xop'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Opteron_G4-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fma4'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xop'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Opteron_G5'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fma4'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tbm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xop'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Opteron_G5-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fma4'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tbm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xop'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='SapphireRapids'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-tile'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrc'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fzrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='SapphireRapids-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-tile'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrc'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fzrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='SapphireRapids-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-tile'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fbsdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrc'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fzrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='psdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='SapphireRapids-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-tile'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cldemote'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fbsdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrc'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fzrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdir64b'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdiri'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='psdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='SierraForest'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-ne-convert'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cmpccxadd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fbsdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mcdt-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pbrsb-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='psdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='SierraForest-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-ne-convert'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cmpccxadd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fbsdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mcdt-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pbrsb-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='psdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Client'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Client-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Client-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Client-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Client-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Client-v4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Server'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Server-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Server-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Server-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Server-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Server-v4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Server-v5'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Snowridge'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cldemote'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='core-capability'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdir64b'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdiri'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mpx'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='split-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Snowridge-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cldemote'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='core-capability'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdir64b'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdiri'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mpx'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='split-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Snowridge-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cldemote'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='core-capability'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdir64b'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdiri'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='split-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Snowridge-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cldemote'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='core-capability'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdir64b'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdiri'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='split-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Snowridge-v4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cldemote'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdir64b'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdiri'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='athlon'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='3dnow'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='3dnowext'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='athlon-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='3dnow'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='3dnowext'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='core2duo'>
Dec 01 09:32:23 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v613: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='core2duo-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='coreduo'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='coreduo-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='n270'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='n270-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='phenom'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='3dnow'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='3dnowext'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='phenom-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='3dnow'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='3dnowext'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </mode>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   </cpu>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <memoryBacking supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <enum name='sourceType'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <value>file</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <value>anonymous</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <value>memfd</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   </memoryBacking>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <devices>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <disk supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='diskDevice'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>disk</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>cdrom</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>floppy</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>lun</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='bus'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>fdc</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>scsi</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtio</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>usb</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>sata</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='model'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtio</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtio-transitional</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtio-non-transitional</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </disk>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <graphics supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='type'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>vnc</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>egl-headless</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>dbus</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </graphics>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <video supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='modelType'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>vga</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>cirrus</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtio</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>none</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>bochs</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>ramfb</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </video>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <hostdev supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='mode'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>subsystem</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='startupPolicy'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>default</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>mandatory</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>requisite</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>optional</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='subsysType'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>usb</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>pci</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>scsi</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='capsType'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='pciBackend'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </hostdev>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <rng supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='model'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtio</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtio-transitional</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtio-non-transitional</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='backendModel'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>random</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>egd</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>builtin</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </rng>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <filesystem supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='driverType'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>path</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>handle</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtiofs</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </filesystem>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <tpm supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='model'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>tpm-tis</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>tpm-crb</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='backendModel'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>emulator</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>external</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='backendVersion'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>2.0</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </tpm>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <redirdev supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='bus'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>usb</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </redirdev>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <channel supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='type'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>pty</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>unix</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </channel>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <crypto supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='model'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='type'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>qemu</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='backendModel'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>builtin</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </crypto>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <interface supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='backendType'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>default</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>passt</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </interface>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <panic supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='model'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>isa</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>hyperv</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </panic>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <console supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='type'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>null</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>vc</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>pty</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>dev</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>file</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>pipe</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>stdio</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>udp</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>tcp</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>unix</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>qemu-vdagent</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>dbus</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </console>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   </devices>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <features>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <gic supported='no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <vmcoreinfo supported='yes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <genid supported='yes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <backingStoreInput supported='yes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <backup supported='yes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <async-teardown supported='yes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <ps2 supported='yes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <sev supported='no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <sgx supported='no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <hyperv supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='features'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>relaxed</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>vapic</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>spinlocks</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>vpindex</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>runtime</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>synic</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>stimer</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>reset</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>vendor_id</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>frequencies</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>reenlightenment</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>tlbflush</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>ipi</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>avic</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>emsr_bitmap</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>xmm_input</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <defaults>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <spinlocks>4095</spinlocks>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <stimer_direct>on</stimer_direct>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <tlbflush_direct>on</tlbflush_direct>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <tlbflush_extended>on</tlbflush_extended>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </defaults>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </hyperv>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <launchSecurity supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='sectype'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>tdx</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </launchSecurity>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   </features>
Dec 01 09:32:23 compute-0 nova_compute[249760]: </domainCapabilities>
Dec 01 09:32:23 compute-0 nova_compute[249760]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 01 09:32:23 compute-0 nova_compute[249760]: 2025-12-01 09:32:23.507 249764 DEBUG nova.virt.libvirt.host [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 01 09:32:23 compute-0 nova_compute[249760]: <domainCapabilities>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <path>/usr/libexec/qemu-kvm</path>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <domain>kvm</domain>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <arch>x86_64</arch>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <vcpu max='240'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <iothreads supported='yes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <os supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <enum name='firmware'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <loader supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='type'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>rom</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>pflash</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='readonly'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>yes</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>no</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='secure'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>no</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </loader>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   </os>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <cpu>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <mode name='host-passthrough' supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='hostPassthroughMigratable'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>on</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>off</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </mode>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <mode name='maximum' supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='maximumMigratable'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>on</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>off</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </mode>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <mode name='host-model' supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <vendor>AMD</vendor>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='x2apic'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='tsc-deadline'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='hypervisor'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='tsc_adjust'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='spec-ctrl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='stibp'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='ssbd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='cmp_legacy'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='overflow-recov'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='succor'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='ibrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='amd-ssbd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='virt-ssbd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='lbrv'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='tsc-scale'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='vmcb-clean'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='flushbyasid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='pause-filter'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='pfthreshold'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='svme-addr-chk'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <feature policy='disable' name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </mode>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <mode name='custom' supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Broadwell'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Broadwell-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Broadwell-noTSX'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Broadwell-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Broadwell-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Broadwell-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Broadwell-v4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cascadelake-Server'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cascadelake-Server-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cascadelake-Server-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cascadelake-Server-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cascadelake-Server-v4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cascadelake-Server-v5'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cooperlake'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cooperlake-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Cooperlake-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Denverton'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mpx'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Denverton-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mpx'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Denverton-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Denverton-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Dhyana-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Genoa'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amd-psfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='auto-ibrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='no-nested-data-bp'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='null-sel-clr-base'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='stibp-always-on'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Genoa-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amd-psfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='auto-ibrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='no-nested-data-bp'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='null-sel-clr-base'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='stibp-always-on'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Milan'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Milan-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Milan-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amd-psfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='no-nested-data-bp'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='null-sel-clr-base'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='stibp-always-on'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Rome'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Rome-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Rome-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-Rome-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='EPYC-v4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='GraniteRapids'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-tile'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fbsdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrc'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fzrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mcdt-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pbrsb-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='prefetchiti'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='psdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='GraniteRapids-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-tile'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fbsdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrc'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fzrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mcdt-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pbrsb-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='prefetchiti'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='psdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='GraniteRapids-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-tile'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx10'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx10-128'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx10-256'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx10-512'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cldemote'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fbsdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrc'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fzrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mcdt-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdir64b'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdiri'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pbrsb-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='prefetchiti'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='psdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Haswell'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Haswell-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Haswell-noTSX'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Haswell-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Haswell-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Haswell-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Haswell-v4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server-noTSX'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server-v4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server-v5'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server-v6'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Icelake-Server-v7'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='IvyBridge'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='IvyBridge-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='IvyBridge-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='IvyBridge-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='KnightsMill'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-4fmaps'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-4vnniw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512er'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512pf'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='KnightsMill-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-4fmaps'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-4vnniw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512er'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512pf'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Opteron_G4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fma4'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xop'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Opteron_G4-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fma4'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xop'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Opteron_G5'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fma4'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tbm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xop'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Opteron_G5-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fma4'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tbm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xop'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='SapphireRapids'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-tile'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrc'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fzrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='SapphireRapids-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-tile'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrc'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fzrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='SapphireRapids-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-tile'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fbsdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrc'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fzrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='psdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='SapphireRapids-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='amx-tile'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-bf16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-fp16'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bitalg'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cldemote'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fbsdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrc'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fzrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='la57'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdir64b'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdiri'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='psdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='taa-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xfd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='SierraForest'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-ne-convert'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cmpccxadd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fbsdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mcdt-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pbrsb-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='psdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='SierraForest-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-ifma'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-ne-convert'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx-vnni-int8'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cmpccxadd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fbsdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='fsrs'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ibrs-all'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mcdt-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pbrsb-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='psdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='serialize'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vaes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Client'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Client-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Client-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Client-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Client-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Client-v4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Server'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Server-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Server-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Server-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='hle'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='rtm'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Server-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Server-v4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Skylake-Server-v5'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512bw'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512cd'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512dq'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512f'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='avx512vl'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='invpcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pcid'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='pku'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Snowridge'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cldemote'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='core-capability'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdir64b'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdiri'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mpx'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='split-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Snowridge-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cldemote'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='core-capability'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdir64b'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdiri'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='mpx'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='split-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Snowridge-v2'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cldemote'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='core-capability'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdir64b'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdiri'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='split-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Snowridge-v3'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cldemote'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='core-capability'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdir64b'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdiri'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='split-lock-detect'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='Snowridge-v4'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='cldemote'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='erms'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='gfni'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdir64b'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='movdiri'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='xsaves'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='athlon'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='3dnow'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='3dnowext'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='athlon-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='3dnow'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='3dnowext'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='core2duo'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='core2duo-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='coreduo'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='coreduo-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='n270'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='n270-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='ss'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='phenom'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='3dnow'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='3dnowext'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <blockers model='phenom-v1'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='3dnow'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <feature name='3dnowext'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </blockers>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </mode>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   </cpu>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <memoryBacking supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <enum name='sourceType'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <value>file</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <value>anonymous</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <value>memfd</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   </memoryBacking>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <devices>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <disk supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='diskDevice'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>disk</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>cdrom</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>floppy</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>lun</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='bus'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>ide</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>fdc</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>scsi</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtio</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>usb</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>sata</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='model'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtio</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtio-transitional</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtio-non-transitional</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </disk>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <graphics supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='type'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>vnc</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>egl-headless</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>dbus</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </graphics>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <video supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='modelType'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>vga</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>cirrus</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtio</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>none</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>bochs</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>ramfb</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </video>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <hostdev supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='mode'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>subsystem</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='startupPolicy'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>default</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>mandatory</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>requisite</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>optional</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='subsysType'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>usb</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>pci</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>scsi</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='capsType'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='pciBackend'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </hostdev>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <rng supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='model'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtio</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtio-transitional</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtio-non-transitional</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='backendModel'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>random</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>egd</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>builtin</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </rng>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <filesystem supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='driverType'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>path</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>handle</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>virtiofs</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </filesystem>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <tpm supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='model'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>tpm-tis</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>tpm-crb</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='backendModel'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>emulator</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>external</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='backendVersion'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>2.0</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </tpm>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <redirdev supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='bus'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>usb</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </redirdev>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <channel supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='type'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>pty</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>unix</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </channel>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <crypto supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='model'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='type'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>qemu</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='backendModel'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>builtin</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </crypto>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <interface supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='backendType'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>default</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>passt</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </interface>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <panic supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='model'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>isa</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>hyperv</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </panic>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <console supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='type'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>null</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>vc</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>pty</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>dev</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>file</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>pipe</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>stdio</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>udp</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>tcp</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>unix</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>qemu-vdagent</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>dbus</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </console>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   </devices>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   <features>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <gic supported='no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <vmcoreinfo supported='yes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <genid supported='yes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <backingStoreInput supported='yes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <backup supported='yes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <async-teardown supported='yes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <ps2 supported='yes'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <sev supported='no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <sgx supported='no'/>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <hyperv supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='features'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>relaxed</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>vapic</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>spinlocks</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>vpindex</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>runtime</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>synic</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>stimer</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>reset</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>vendor_id</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>frequencies</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>reenlightenment</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>tlbflush</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>ipi</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>avic</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>emsr_bitmap</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>xmm_input</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <defaults>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <spinlocks>4095</spinlocks>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <stimer_direct>on</stimer_direct>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <tlbflush_direct>on</tlbflush_direct>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <tlbflush_extended>on</tlbflush_extended>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </defaults>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </hyperv>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     <launchSecurity supported='yes'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       <enum name='sectype'>
Dec 01 09:32:23 compute-0 nova_compute[249760]:         <value>tdx</value>
Dec 01 09:32:23 compute-0 nova_compute[249760]:       </enum>
Dec 01 09:32:23 compute-0 nova_compute[249760]:     </launchSecurity>
Dec 01 09:32:23 compute-0 nova_compute[249760]:   </features>
Dec 01 09:32:23 compute-0 nova_compute[249760]: </domainCapabilities>
Dec 01 09:32:23 compute-0 nova_compute[249760]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 01 09:32:23 compute-0 nova_compute[249760]: 2025-12-01 09:32:23.568 249764 DEBUG nova.virt.libvirt.host [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 01 09:32:23 compute-0 nova_compute[249760]: 2025-12-01 09:32:23.568 249764 INFO nova.virt.libvirt.host [None req-c661b254-686e-49d1-96a6-76d7bc246c4a - - - - - -] Secure Boot support detected
Dec 01 09:32:23 compute-0 nova_compute[249760]: 2025-12-01 09:32:23.569 249764 DEBUG oslo_concurrency.lockutils [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 01 09:32:23 compute-0 nova_compute[249760]: 2025-12-01 09:32:23.569 249764 DEBUG oslo_concurrency.lockutils [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 01 09:32:23 compute-0 nova_compute[249760]: 2025-12-01 09:32:23.570 249764 DEBUG oslo_concurrency.lockutils [None req-f14a709e-8a00-433a-aeab-e61ebece9955 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 01 09:32:23 compute-0 virtqemud[250400]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Dec 01 09:32:23 compute-0 virtqemud[250400]: hostname: compute-0
Dec 01 09:32:23 compute-0 virtqemud[250400]: End of file while reading data: Input/output error
Dec 01 09:32:23 compute-0 systemd[1]: libpod-a5b64ae738e496602a7521c597d5808e11b7ca434cbff9ed3330afa8f5c806f8.scope: Deactivated successfully.
Dec 01 09:32:23 compute-0 systemd[1]: libpod-a5b64ae738e496602a7521c597d5808e11b7ca434cbff9ed3330afa8f5c806f8.scope: Consumed 3.294s CPU time.
Dec 01 09:32:23 compute-0 podman[250647]: 2025-12-01 09:32:23.987350046 +0000 UTC m=+0.511325085 container died a5b64ae738e496602a7521c597d5808e11b7ca434cbff9ed3330afa8f5c806f8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team)
Dec 01 09:32:24 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a5b64ae738e496602a7521c597d5808e11b7ca434cbff9ed3330afa8f5c806f8-userdata-shm.mount: Deactivated successfully.
Dec 01 09:32:24 compute-0 systemd[1]: var-lib-containers-storage-overlay-f3d937cf4bc601078310909448bf482559071830be6dd775607e2c1c5141cb49-merged.mount: Deactivated successfully.
Dec 01 09:32:24 compute-0 ceph-mon[75031]: pgmap v613: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:24 compute-0 podman[250647]: 2025-12-01 09:32:24.578100655 +0000 UTC m=+1.102075674 container cleanup a5b64ae738e496602a7521c597d5808e11b7ca434cbff9ed3330afa8f5c806f8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 01 09:32:24 compute-0 podman[250647]: nova_compute
Dec 01 09:32:24 compute-0 podman[250677]: nova_compute
Dec 01 09:32:24 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Dec 01 09:32:24 compute-0 systemd[1]: Stopped nova_compute container.
Dec 01 09:32:24 compute-0 systemd[1]: Starting nova_compute container...
Dec 01 09:32:24 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:32:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3d937cf4bc601078310909448bf482559071830be6dd775607e2c1c5141cb49/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec 01 09:32:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3d937cf4bc601078310909448bf482559071830be6dd775607e2c1c5141cb49/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 01 09:32:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3d937cf4bc601078310909448bf482559071830be6dd775607e2c1c5141cb49/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 01 09:32:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3d937cf4bc601078310909448bf482559071830be6dd775607e2c1c5141cb49/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 01 09:32:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3d937cf4bc601078310909448bf482559071830be6dd775607e2c1c5141cb49/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 01 09:32:24 compute-0 podman[250690]: 2025-12-01 09:32:24.818631176 +0000 UTC m=+0.122525006 container init a5b64ae738e496602a7521c597d5808e11b7ca434cbff9ed3330afa8f5c806f8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=nova_compute, managed_by=edpm_ansible)
Dec 01 09:32:24 compute-0 podman[250690]: 2025-12-01 09:32:24.832571148 +0000 UTC m=+0.136464908 container start a5b64ae738e496602a7521c597d5808e11b7ca434cbff9ed3330afa8f5c806f8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec 01 09:32:24 compute-0 nova_compute[250706]: + sudo -E kolla_set_configs
Dec 01 09:32:24 compute-0 podman[250690]: nova_compute
Dec 01 09:32:24 compute-0 systemd[1]: Started nova_compute container.
Dec 01 09:32:24 compute-0 sudo[250630]: pam_unix(sudo:session): session closed for user root
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Validating config file
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Copying service configuration files
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Deleting /etc/ceph
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Creating directory /etc/ceph
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Setting permission for /etc/ceph
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Writing out command to execute
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 01 09:32:24 compute-0 nova_compute[250706]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 01 09:32:24 compute-0 nova_compute[250706]: ++ cat /run_command
Dec 01 09:32:24 compute-0 nova_compute[250706]: + CMD=nova-compute
Dec 01 09:32:24 compute-0 nova_compute[250706]: + ARGS=
Dec 01 09:32:24 compute-0 nova_compute[250706]: + sudo kolla_copy_cacerts
Dec 01 09:32:24 compute-0 nova_compute[250706]: + [[ ! -n '' ]]
Dec 01 09:32:24 compute-0 nova_compute[250706]: + . kolla_extend_start
Dec 01 09:32:24 compute-0 nova_compute[250706]: Running command: 'nova-compute'
Dec 01 09:32:24 compute-0 nova_compute[250706]: + echo 'Running command: '\''nova-compute'\'''
Dec 01 09:32:24 compute-0 nova_compute[250706]: + umask 0022
Dec 01 09:32:24 compute-0 nova_compute[250706]: + exec nova-compute
Dec 01 09:32:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:32:25 compute-0 sudo[250867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-poiqpjqphpwdpwwdjoghkynyroizfkjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581545.105032-1566-94363760820238/AnsiballZ_podman_container.py'
Dec 01 09:32:25 compute-0 sudo[250867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:32:25 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v614: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:25 compute-0 python3.9[250869]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 01 09:32:25 compute-0 systemd[1]: Started libpod-conmon-b54a93fad67812a69ed4ee78e6e21380d1f3fd415ea365cb91a6fd2bae05f81c.scope.
Dec 01 09:32:25 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:32:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93a45807daa2df8bc0367a7472b5d2e2c9f0891beaa6ea83744c4fe907ed08e1/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Dec 01 09:32:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93a45807daa2df8bc0367a7472b5d2e2c9f0891beaa6ea83744c4fe907ed08e1/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 01 09:32:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93a45807daa2df8bc0367a7472b5d2e2c9f0891beaa6ea83744c4fe907ed08e1/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Dec 01 09:32:26 compute-0 podman[250896]: 2025-12-01 09:32:26.009586696 +0000 UTC m=+0.133604566 container init b54a93fad67812a69ed4ee78e6e21380d1f3fd415ea365cb91a6fd2bae05f81c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=nova_compute_init, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:32:26 compute-0 podman[250896]: 2025-12-01 09:32:26.016209576 +0000 UTC m=+0.140227426 container start b54a93fad67812a69ed4ee78e6e21380d1f3fd415ea365cb91a6fd2bae05f81c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, config_id=edpm, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 01 09:32:26 compute-0 python3.9[250869]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Dec 01 09:32:26 compute-0 nova_compute_init[250918]: INFO:nova_statedir:Applying nova statedir ownership
Dec 01 09:32:26 compute-0 nova_compute_init[250918]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Dec 01 09:32:26 compute-0 nova_compute_init[250918]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Dec 01 09:32:26 compute-0 nova_compute_init[250918]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Dec 01 09:32:26 compute-0 nova_compute_init[250918]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Dec 01 09:32:26 compute-0 nova_compute_init[250918]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Dec 01 09:32:26 compute-0 nova_compute_init[250918]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Dec 01 09:32:26 compute-0 nova_compute_init[250918]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Dec 01 09:32:26 compute-0 nova_compute_init[250918]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Dec 01 09:32:26 compute-0 nova_compute_init[250918]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Dec 01 09:32:26 compute-0 nova_compute_init[250918]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Dec 01 09:32:26 compute-0 nova_compute_init[250918]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Dec 01 09:32:26 compute-0 nova_compute_init[250918]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Dec 01 09:32:26 compute-0 nova_compute_init[250918]: INFO:nova_statedir:Nova statedir ownership complete
Dec 01 09:32:26 compute-0 systemd[1]: libpod-b54a93fad67812a69ed4ee78e6e21380d1f3fd415ea365cb91a6fd2bae05f81c.scope: Deactivated successfully.
Dec 01 09:32:26 compute-0 podman[250932]: 2025-12-01 09:32:26.11120915 +0000 UTC m=+0.026115702 container died b54a93fad67812a69ed4ee78e6e21380d1f3fd415ea365cb91a6fd2bae05f81c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init)
Dec 01 09:32:26 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b54a93fad67812a69ed4ee78e6e21380d1f3fd415ea365cb91a6fd2bae05f81c-userdata-shm.mount: Deactivated successfully.
Dec 01 09:32:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-93a45807daa2df8bc0367a7472b5d2e2c9f0891beaa6ea83744c4fe907ed08e1-merged.mount: Deactivated successfully.
Dec 01 09:32:26 compute-0 podman[250932]: 2025-12-01 09:32:26.140961816 +0000 UTC m=+0.055868378 container cleanup b54a93fad67812a69ed4ee78e6e21380d1f3fd415ea365cb91a6fd2bae05f81c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.license=GPLv2, config_id=edpm, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 01 09:32:26 compute-0 systemd[1]: libpod-conmon-b54a93fad67812a69ed4ee78e6e21380d1f3fd415ea365cb91a6fd2bae05f81c.scope: Deactivated successfully.
Dec 01 09:32:26 compute-0 sudo[250867]: pam_unix(sudo:session): session closed for user root
Dec 01 09:32:26 compute-0 ceph-mon[75031]: pgmap v614: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:26 compute-0 sshd-session[220732]: Connection closed by 192.168.122.30 port 54424
Dec 01 09:32:26 compute-0 sshd-session[220729]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:32:26 compute-0 systemd[1]: session-50.scope: Deactivated successfully.
Dec 01 09:32:26 compute-0 systemd[1]: session-50.scope: Consumed 2min 21.501s CPU time.
Dec 01 09:32:26 compute-0 systemd-logind[788]: Session 50 logged out. Waiting for processes to exit.
Dec 01 09:32:26 compute-0 systemd-logind[788]: Removed session 50.
Dec 01 09:32:26 compute-0 nova_compute[250706]: 2025-12-01 09:32:26.884 250710 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 01 09:32:26 compute-0 nova_compute[250706]: 2025-12-01 09:32:26.885 250710 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 01 09:32:26 compute-0 nova_compute[250706]: 2025-12-01 09:32:26.885 250710 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 01 09:32:26 compute-0 nova_compute[250706]: 2025-12-01 09:32:26.885 250710 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Dec 01 09:32:26 compute-0 podman[250981]: 2025-12-01 09:32:26.937570929 +0000 UTC m=+0.066578097 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.021 250710 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.042 250710 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.043 250710 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.526 250710 INFO nova.virt.driver [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Dec 01 09:32:27 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v615: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.626 250710 INFO nova.compute.provider_config [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.643 250710 DEBUG oslo_concurrency.lockutils [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.643 250710 DEBUG oslo_concurrency.lockutils [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.643 250710 DEBUG oslo_concurrency.lockutils [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.644 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.644 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.644 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.644 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.644 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.645 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.645 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.645 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.645 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.645 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.645 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.645 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.646 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.646 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.646 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.646 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.646 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.646 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.647 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.647 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.647 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.647 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.647 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.647 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.648 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.648 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.648 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.648 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.648 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.649 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.649 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.649 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.649 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.649 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.650 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.650 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.650 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.650 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.650 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.650 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.651 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.651 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.651 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.651 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.652 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.652 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.652 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.652 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.652 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.652 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.653 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.653 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.653 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.653 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.653 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.653 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.653 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.654 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.654 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.654 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.654 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.654 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.654 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.654 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.655 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.655 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.655 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.655 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.655 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.655 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.655 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.656 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.656 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.656 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.656 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.656 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.656 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.656 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.657 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.657 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.657 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.657 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.657 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.657 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.658 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.658 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.658 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.658 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.658 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.658 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.658 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.659 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.659 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.659 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.659 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.659 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.659 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.659 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.660 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.660 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.660 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.660 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.660 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.660 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.660 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.661 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.661 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.661 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.661 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.661 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.661 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.661 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.662 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.662 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.662 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.662 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.662 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.662 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.662 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.663 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.663 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.663 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.663 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.663 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.663 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.663 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.664 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.664 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.664 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.664 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.664 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.664 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.664 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.665 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.665 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.665 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.665 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.665 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.665 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.665 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.666 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.666 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.666 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.666 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.666 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.666 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.667 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.667 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.667 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.667 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.667 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.667 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.667 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.668 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.668 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.668 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.668 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.668 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.668 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.669 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.669 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.669 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.669 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.669 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.669 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.669 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.670 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.670 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.670 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.670 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.670 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.670 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.670 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.671 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.671 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.671 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.671 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.671 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.671 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.672 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.672 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.672 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.672 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.672 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.672 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.672 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.673 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.673 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.673 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.673 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.673 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.673 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.674 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.674 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.674 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.674 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.674 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.674 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.674 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.675 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.675 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.675 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.675 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.675 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.675 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.675 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.676 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.676 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.676 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.676 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.676 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.676 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.676 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.677 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.677 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.677 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.677 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.677 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.677 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.677 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.678 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.678 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.678 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.678 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.678 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.678 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.678 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.679 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.679 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.679 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.679 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.679 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.679 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.680 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.680 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.680 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.680 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.680 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.680 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.680 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.681 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.681 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.681 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.681 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.681 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.681 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.682 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.682 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.682 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.682 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.682 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.682 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.682 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.682 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.683 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.683 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.683 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.683 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.683 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.683 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.683 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.684 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.684 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.684 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.684 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.684 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.684 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.685 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.685 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.685 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.685 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.685 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.685 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.685 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.686 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.686 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.686 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.686 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.686 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.686 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.686 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.687 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.687 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.687 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.687 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.687 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.687 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.687 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.688 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.688 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.688 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.688 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.688 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.688 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.688 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.689 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.689 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.689 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.689 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.689 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.689 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.689 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.690 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.690 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.690 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.690 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.690 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.690 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.691 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.691 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.691 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.691 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.691 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.691 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.691 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.692 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.692 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.692 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.692 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.692 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.692 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.692 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.693 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.693 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.693 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.693 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.693 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.693 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.694 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.694 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.694 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.694 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.694 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.694 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.694 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.695 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.695 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.695 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.695 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.695 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.695 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.695 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.696 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.696 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.696 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.696 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.696 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.696 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.696 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.697 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.697 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.697 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.697 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.697 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.697 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.698 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.698 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.698 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.698 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.698 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.698 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.699 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.699 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.699 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.699 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.699 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.699 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.699 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.700 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.700 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.700 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.700 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.700 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.700 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.701 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.701 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.701 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.701 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.701 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.701 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.701 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.702 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.702 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.702 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.702 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.702 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.702 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.702 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.703 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.703 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.703 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.703 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.703 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.703 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.703 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.704 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.704 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.704 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.704 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.704 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.704 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.704 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.705 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.705 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.705 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.705 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.705 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.705 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.705 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.706 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.706 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.706 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.706 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.706 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.706 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.706 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.707 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.707 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.707 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.707 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.707 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.707 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.708 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.708 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.708 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.708 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.708 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.708 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.709 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.709 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.709 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.709 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.709 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.709 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.710 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.710 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.710 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.710 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.710 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.710 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.710 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.711 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.711 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.711 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.711 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.711 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.711 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.712 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.712 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.712 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.712 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.712 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.712 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.713 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.713 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.713 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.713 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.713 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.713 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.714 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.714 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.714 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.714 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.714 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.714 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.714 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.715 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.715 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.715 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.715 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.715 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.716 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.716 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.716 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.716 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.716 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.717 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.717 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.717 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.717 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.717 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.717 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.718 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.718 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.718 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.718 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.718 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.718 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.719 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.719 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.719 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.719 250710 WARNING oslo_config.cfg [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec 01 09:32:27 compute-0 nova_compute[250706]: live_migration_uri is deprecated for removal in favor of two other options that
Dec 01 09:32:27 compute-0 nova_compute[250706]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec 01 09:32:27 compute-0 nova_compute[250706]: and ``live_migration_inbound_addr`` respectively.
Dec 01 09:32:27 compute-0 nova_compute[250706]: ).  Its value may be silently ignored in the future.
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.719 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.720 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.720 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.720 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.720 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.720 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.720 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.721 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.721 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.721 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.721 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.721 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.721 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.722 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.722 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.722 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.722 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.722 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.723 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.rbd_secret_uuid        = 5620a9fb-e540-5250-a0e8-7aaad5347e3b log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.723 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.723 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.723 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.723 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.723 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.723 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.724 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.724 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.724 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.724 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.724 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.724 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.725 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.725 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.725 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.725 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.725 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.726 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.726 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.726 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.726 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.726 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.727 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.727 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.727 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.727 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.727 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.728 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.728 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.728 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.728 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.728 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.728 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.729 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.729 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.729 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.729 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.729 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.729 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.730 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.730 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.730 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.730 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.730 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.730 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.731 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.731 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.731 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.731 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.731 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.731 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.731 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.732 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.732 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.732 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.732 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.732 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.732 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.733 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.733 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.733 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.733 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.733 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.733 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.734 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.734 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.734 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.734 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.734 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.734 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.734 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.735 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.735 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.735 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.735 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.735 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.735 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.735 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.736 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.736 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.736 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.736 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.736 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.736 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.736 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.736 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.737 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.737 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.737 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.737 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.737 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.737 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.737 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.738 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.738 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.738 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.738 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.738 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.738 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.738 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.739 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.739 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.739 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.739 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.739 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.739 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.739 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.740 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.740 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.740 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.740 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.740 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.740 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.741 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.741 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.741 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.741 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.741 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.741 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.741 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.742 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.742 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.742 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.742 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.742 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.743 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.743 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.743 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.743 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.743 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.743 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.743 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.744 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.744 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.744 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.744 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.744 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.744 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.745 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.745 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.745 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.745 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.745 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.745 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.746 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.746 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.746 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.746 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.746 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.746 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.747 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.747 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.747 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.747 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.747 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.747 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.747 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.748 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.748 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.748 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.748 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.748 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.748 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.749 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.749 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.749 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.749 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.749 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.749 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.750 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.750 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.750 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.750 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.750 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.750 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.751 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.751 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.751 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.751 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.752 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.752 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.752 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.752 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.752 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.752 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.752 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.753 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.753 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.753 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.753 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.753 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.753 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.753 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.754 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.754 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.754 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.754 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.754 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.754 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.755 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.755 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.755 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.755 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.755 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.755 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.755 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.756 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.756 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.756 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.756 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.756 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.756 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.756 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.757 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.757 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.757 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.757 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.757 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.757 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.758 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.758 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.758 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.758 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.758 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.758 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.759 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.759 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.759 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.759 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.759 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.759 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.760 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.760 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.760 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.760 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.760 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.761 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.761 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.761 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.761 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.761 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.762 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.762 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.762 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.762 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.762 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.763 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.763 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.763 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.763 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.763 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.763 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.764 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.764 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.764 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.764 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.764 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.765 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.765 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.765 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.765 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.766 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.766 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.766 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.766 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.766 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.766 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.767 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.767 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.767 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.767 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.767 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.767 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.768 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.768 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.768 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.768 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.768 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.768 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.768 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.769 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.769 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.769 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.769 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.769 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.769 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.770 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.770 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.770 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.770 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.770 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.770 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.770 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.771 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.771 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.771 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.771 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.771 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.771 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.771 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.772 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.772 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.772 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.772 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.772 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.772 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.772 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.773 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.773 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.773 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.773 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.773 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.773 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.774 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.774 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.774 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.774 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.774 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.774 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.774 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.775 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.775 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.775 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.775 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.775 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.775 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.776 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.776 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.776 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.776 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.776 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.776 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.776 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.777 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.777 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.777 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.777 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.777 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.778 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.778 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.778 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.778 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.778 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.778 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.778 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.779 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.779 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.779 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.779 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.779 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.779 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.779 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.780 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.780 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.780 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.780 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.780 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.781 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.781 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.781 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.781 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.781 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.781 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.781 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.782 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.782 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.782 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.782 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.782 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.782 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.782 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.783 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.783 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.783 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.783 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.783 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.783 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.784 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.784 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.784 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.784 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.784 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.784 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.784 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.785 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.785 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.785 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.785 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.785 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.785 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.786 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.786 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.786 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.786 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.786 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.786 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.787 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.787 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.787 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.787 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.787 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.787 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.788 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.788 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.788 250710 DEBUG oslo_service.service [None req-921360ba-a048-4185-bc73-2997dde11912 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.789 250710 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.806 250710 DEBUG nova.virt.libvirt.host [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.806 250710 DEBUG nova.virt.libvirt.host [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.806 250710 DEBUG nova.virt.libvirt.host [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.807 250710 DEBUG nova.virt.libvirt.host [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.829 250710 DEBUG nova.virt.libvirt.host [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f5359419580> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.832 250710 DEBUG nova.virt.libvirt.host [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f5359419580> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.833 250710 INFO nova.virt.libvirt.driver [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Connection event '1' reason 'None'
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.842 250710 INFO nova.virt.libvirt.host [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Libvirt host capabilities <capabilities>
Dec 01 09:32:27 compute-0 nova_compute[250706]: 
Dec 01 09:32:27 compute-0 nova_compute[250706]:   <host>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <uuid>52310927-1d30-4bda-9d2b-fd9f7cfadc4d</uuid>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <cpu>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <arch>x86_64</arch>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model>EPYC-Rome-v4</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <vendor>AMD</vendor>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <microcode version='16777317'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <signature family='23' model='49' stepping='0'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <maxphysaddr mode='emulate' bits='40'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature name='x2apic'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature name='tsc-deadline'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature name='osxsave'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature name='hypervisor'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature name='tsc_adjust'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature name='spec-ctrl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature name='stibp'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature name='arch-capabilities'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature name='ssbd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature name='cmp_legacy'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature name='topoext'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature name='virt-ssbd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature name='lbrv'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature name='tsc-scale'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature name='vmcb-clean'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature name='pause-filter'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature name='pfthreshold'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature name='svme-addr-chk'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature name='rdctl-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature name='skip-l1dfl-vmentry'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature name='mds-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature name='pschange-mc-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <pages unit='KiB' size='4'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <pages unit='KiB' size='2048'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <pages unit='KiB' size='1048576'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </cpu>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <power_management>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <suspend_mem/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </power_management>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <iommu support='no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <migration_features>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <live/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <uri_transports>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <uri_transport>tcp</uri_transport>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <uri_transport>rdma</uri_transport>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </uri_transports>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </migration_features>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <topology>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <cells num='1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <cell id='0'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:           <memory unit='KiB'>7864320</memory>
Dec 01 09:32:27 compute-0 nova_compute[250706]:           <pages unit='KiB' size='4'>1966080</pages>
Dec 01 09:32:27 compute-0 nova_compute[250706]:           <pages unit='KiB' size='2048'>0</pages>
Dec 01 09:32:27 compute-0 nova_compute[250706]:           <pages unit='KiB' size='1048576'>0</pages>
Dec 01 09:32:27 compute-0 nova_compute[250706]:           <distances>
Dec 01 09:32:27 compute-0 nova_compute[250706]:             <sibling id='0' value='10'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:           </distances>
Dec 01 09:32:27 compute-0 nova_compute[250706]:           <cpus num='8'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:           </cpus>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         </cell>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </cells>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </topology>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <cache>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </cache>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <secmodel>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model>selinux</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <doi>0</doi>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </secmodel>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <secmodel>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model>dac</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <doi>0</doi>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <baselabel type='kvm'>+107:+107</baselabel>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <baselabel type='qemu'>+107:+107</baselabel>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </secmodel>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   </host>
Dec 01 09:32:27 compute-0 nova_compute[250706]: 
Dec 01 09:32:27 compute-0 nova_compute[250706]:   <guest>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <os_type>hvm</os_type>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <arch name='i686'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <wordsize>32</wordsize>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <domain type='qemu'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <domain type='kvm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </arch>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <features>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <pae/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <nonpae/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <acpi default='on' toggle='yes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <apic default='on' toggle='no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <cpuselection/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <deviceboot/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <disksnapshot default='on' toggle='no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <externalSnapshot/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </features>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   </guest>
Dec 01 09:32:27 compute-0 nova_compute[250706]: 
Dec 01 09:32:27 compute-0 nova_compute[250706]:   <guest>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <os_type>hvm</os_type>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <arch name='x86_64'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <wordsize>64</wordsize>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <domain type='qemu'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <domain type='kvm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </arch>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <features>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <acpi default='on' toggle='yes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <apic default='on' toggle='no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <cpuselection/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <deviceboot/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <disksnapshot default='on' toggle='no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <externalSnapshot/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </features>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   </guest>
Dec 01 09:32:27 compute-0 nova_compute[250706]: 
Dec 01 09:32:27 compute-0 nova_compute[250706]: </capabilities>
Dec 01 09:32:27 compute-0 nova_compute[250706]: 
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.847 250710 DEBUG nova.virt.libvirt.host [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.848 250710 WARNING nova.virt.libvirt.driver [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.849 250710 DEBUG nova.virt.libvirt.volume.mount [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.852 250710 DEBUG nova.virt.libvirt.host [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec 01 09:32:27 compute-0 nova_compute[250706]: <domainCapabilities>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   <path>/usr/libexec/qemu-kvm</path>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   <domain>kvm</domain>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   <arch>i686</arch>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   <vcpu max='4096'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   <iothreads supported='yes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   <os supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <enum name='firmware'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <loader supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='type'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>rom</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>pflash</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='readonly'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>yes</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>no</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='secure'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>no</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </loader>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   </os>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   <cpu>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <mode name='host-passthrough' supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='hostPassthroughMigratable'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>on</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>off</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </mode>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <mode name='maximum' supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='maximumMigratable'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>on</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>off</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </mode>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <mode name='host-model' supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <vendor>AMD</vendor>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='x2apic'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='tsc-deadline'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='hypervisor'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='tsc_adjust'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='spec-ctrl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='stibp'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='ssbd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='cmp_legacy'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='overflow-recov'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='succor'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='ibrs'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='amd-ssbd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='virt-ssbd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='lbrv'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='tsc-scale'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='vmcb-clean'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='flushbyasid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='pause-filter'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='pfthreshold'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='svme-addr-chk'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='disable' name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </mode>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <mode name='custom' supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Broadwell'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Broadwell-IBRS'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Broadwell-noTSX'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Broadwell-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Broadwell-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Broadwell-v3'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Broadwell-v4'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Cascadelake-Server'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Cascadelake-Server-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Cascadelake-Server-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Cascadelake-Server-v3'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Cascadelake-Server-v4'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Cascadelake-Server-v5'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Cooperlake'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Cooperlake-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Cooperlake-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Denverton'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='mpx'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Denverton-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='mpx'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Denverton-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Denverton-v3'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Dhyana-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='EPYC-Genoa'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amd-psfd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='auto-ibrs'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='no-nested-data-bp'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='null-sel-clr-base'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='stibp-always-on'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='EPYC-Genoa-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amd-psfd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='auto-ibrs'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='no-nested-data-bp'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='null-sel-clr-base'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='stibp-always-on'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='EPYC-Milan'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='EPYC-Milan-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='EPYC-Milan-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amd-psfd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='no-nested-data-bp'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='null-sel-clr-base'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='stibp-always-on'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='EPYC-Rome'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='EPYC-Rome-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='EPYC-Rome-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='EPYC-Rome-v3'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='EPYC-v3'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='EPYC-v4'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='GraniteRapids'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-fp16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-int8'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-tile'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-fp16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fbsdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrc'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fzrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='mcdt-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pbrsb-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='prefetchiti'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='psdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xfd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='GraniteRapids-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-fp16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-int8'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-tile'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-fp16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fbsdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrc'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fzrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='mcdt-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pbrsb-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='prefetchiti'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='psdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xfd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='GraniteRapids-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-fp16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-int8'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-tile'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx10'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx10-128'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx10-256'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx10-512'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-fp16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='cldemote'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fbsdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrc'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fzrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='mcdt-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='movdir64b'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='movdiri'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pbrsb-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='prefetchiti'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='psdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xfd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Haswell'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Haswell-IBRS'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Haswell-noTSX'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Haswell-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Haswell-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Haswell-v3'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Haswell-v4'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server-noTSX'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server-v3'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server-v4'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server-v5'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server-v6'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server-v7'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='IvyBridge'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='IvyBridge-IBRS'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='IvyBridge-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='IvyBridge-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='KnightsMill'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-4fmaps'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-4vnniw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512er'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512pf'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='KnightsMill-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-4fmaps'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-4vnniw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512er'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512pf'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Opteron_G4'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fma4'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xop'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Opteron_G4-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fma4'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xop'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Opteron_G5'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fma4'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='tbm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xop'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Opteron_G5-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fma4'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='tbm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xop'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='SapphireRapids'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-int8'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-tile'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-fp16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrc'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fzrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xfd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='SapphireRapids-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-int8'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-tile'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-fp16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrc'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fzrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xfd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='SapphireRapids-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-int8'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-tile'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-fp16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fbsdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrc'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fzrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='psdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xfd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='SapphireRapids-v3'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-int8'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-tile'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-fp16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='cldemote'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fbsdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrc'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fzrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='movdir64b'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='movdiri'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='psdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xfd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='SierraForest'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx-ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx-ne-convert'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx-vnni-int8'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='cmpccxadd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fbsdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='mcdt-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pbrsb-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='psdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='SierraForest-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx-ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx-ne-convert'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx-vnni-int8'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='cmpccxadd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fbsdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='mcdt-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pbrsb-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='psdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Skylake-Client'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Skylake-Client-IBRS'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Skylake-Client-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Skylake-Client-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Skylake-Client-v3'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Skylake-Client-v4'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Skylake-Server'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Skylake-Server-IBRS'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Skylake-Server-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Skylake-Server-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Skylake-Server-v3'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Skylake-Server-v4'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Skylake-Server-v5'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Snowridge'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='cldemote'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='core-capability'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='movdir64b'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='movdiri'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='mpx'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='split-lock-detect'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Snowridge-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='cldemote'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='core-capability'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='movdir64b'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='movdiri'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='mpx'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='split-lock-detect'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Snowridge-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='cldemote'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='core-capability'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='movdir64b'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='movdiri'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='split-lock-detect'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Snowridge-v3'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='cldemote'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='core-capability'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='movdir64b'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='movdiri'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='split-lock-detect'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Snowridge-v4'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='cldemote'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='movdir64b'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='movdiri'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='athlon'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='3dnow'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='3dnowext'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='athlon-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='3dnow'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='3dnowext'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='core2duo'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='core2duo-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='coreduo'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='coreduo-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='n270'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='n270-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='phenom'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='3dnow'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='3dnowext'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='phenom-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='3dnow'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='3dnowext'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </mode>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   </cpu>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   <memoryBacking supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <enum name='sourceType'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <value>file</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <value>anonymous</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <value>memfd</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   </memoryBacking>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   <devices>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <disk supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='diskDevice'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>disk</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>cdrom</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>floppy</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>lun</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='bus'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>fdc</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>scsi</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>virtio</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>usb</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>sata</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='model'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>virtio</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>virtio-transitional</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>virtio-non-transitional</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </disk>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <graphics supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='type'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>vnc</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>egl-headless</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>dbus</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </graphics>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <video supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='modelType'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>vga</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>cirrus</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>virtio</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>none</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>bochs</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>ramfb</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </video>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <hostdev supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='mode'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>subsystem</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='startupPolicy'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>default</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>mandatory</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>requisite</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>optional</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='subsysType'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>usb</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>pci</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>scsi</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='capsType'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='pciBackend'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </hostdev>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <rng supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='model'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>virtio</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>virtio-transitional</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>virtio-non-transitional</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='backendModel'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>random</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>egd</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>builtin</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </rng>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <filesystem supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='driverType'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>path</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>handle</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>virtiofs</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </filesystem>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <tpm supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='model'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>tpm-tis</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>tpm-crb</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='backendModel'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>emulator</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>external</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='backendVersion'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>2.0</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </tpm>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <redirdev supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='bus'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>usb</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </redirdev>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <channel supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='type'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>pty</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>unix</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </channel>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <crypto supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='model'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='type'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>qemu</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='backendModel'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>builtin</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </crypto>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <interface supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='backendType'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>default</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>passt</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </interface>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <panic supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='model'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>isa</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>hyperv</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </panic>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <console supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='type'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>null</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>vc</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>pty</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>dev</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>file</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>pipe</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>stdio</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>udp</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>tcp</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>unix</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>qemu-vdagent</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>dbus</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </console>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   </devices>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   <features>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <gic supported='no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <vmcoreinfo supported='yes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <genid supported='yes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <backingStoreInput supported='yes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <backup supported='yes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <async-teardown supported='yes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <ps2 supported='yes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <sev supported='no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <sgx supported='no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <hyperv supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='features'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>relaxed</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>vapic</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>spinlocks</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>vpindex</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>runtime</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>synic</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>stimer</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>reset</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>vendor_id</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>frequencies</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>reenlightenment</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>tlbflush</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>ipi</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>avic</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>emsr_bitmap</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>xmm_input</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <defaults>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <spinlocks>4095</spinlocks>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <stimer_direct>on</stimer_direct>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <tlbflush_direct>on</tlbflush_direct>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <tlbflush_extended>on</tlbflush_extended>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </defaults>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </hyperv>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <launchSecurity supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='sectype'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>tdx</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </launchSecurity>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   </features>
Dec 01 09:32:27 compute-0 nova_compute[250706]: </domainCapabilities>
Dec 01 09:32:27 compute-0 nova_compute[250706]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.858 250710 DEBUG nova.virt.libvirt.host [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 01 09:32:27 compute-0 nova_compute[250706]: <domainCapabilities>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   <path>/usr/libexec/qemu-kvm</path>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   <domain>kvm</domain>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   <arch>i686</arch>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   <vcpu max='240'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   <iothreads supported='yes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   <os supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <enum name='firmware'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <loader supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='type'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>rom</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>pflash</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='readonly'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>yes</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>no</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='secure'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>no</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </loader>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   </os>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   <cpu>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <mode name='host-passthrough' supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='hostPassthroughMigratable'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>on</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>off</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </mode>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <mode name='maximum' supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='maximumMigratable'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>on</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>off</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </mode>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <mode name='host-model' supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <vendor>AMD</vendor>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='x2apic'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='tsc-deadline'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='hypervisor'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='tsc_adjust'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='spec-ctrl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='stibp'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='ssbd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='cmp_legacy'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='overflow-recov'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='succor'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='ibrs'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='amd-ssbd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='virt-ssbd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='lbrv'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='tsc-scale'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='vmcb-clean'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='flushbyasid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='pause-filter'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='pfthreshold'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='svme-addr-chk'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='disable' name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </mode>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <mode name='custom' supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Broadwell'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Broadwell-IBRS'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Broadwell-noTSX'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Broadwell-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Broadwell-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Broadwell-v3'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Broadwell-v4'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Cascadelake-Server'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Cascadelake-Server-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Cascadelake-Server-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Cascadelake-Server-v3'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Cascadelake-Server-v4'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Cascadelake-Server-v5'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Cooperlake'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Cooperlake-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Cooperlake-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Denverton'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='mpx'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Denverton-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='mpx'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Denverton-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Denverton-v3'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Dhyana-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='EPYC-Genoa'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amd-psfd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='auto-ibrs'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='no-nested-data-bp'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='null-sel-clr-base'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='stibp-always-on'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='EPYC-Genoa-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amd-psfd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='auto-ibrs'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='no-nested-data-bp'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='null-sel-clr-base'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='stibp-always-on'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='EPYC-Milan'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='EPYC-Milan-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='EPYC-Milan-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amd-psfd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='no-nested-data-bp'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='null-sel-clr-base'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='stibp-always-on'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='EPYC-Rome'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='EPYC-Rome-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='EPYC-Rome-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='EPYC-Rome-v3'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='EPYC-v3'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='EPYC-v4'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='GraniteRapids'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-fp16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-int8'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-tile'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-fp16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fbsdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrc'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fzrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='mcdt-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pbrsb-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='prefetchiti'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='psdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xfd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='GraniteRapids-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-fp16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-int8'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-tile'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-fp16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fbsdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrc'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fzrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='mcdt-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pbrsb-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='prefetchiti'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='psdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xfd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='GraniteRapids-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-fp16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-int8'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-tile'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx10'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx10-128'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx10-256'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx10-512'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-fp16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='cldemote'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fbsdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrc'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fzrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='mcdt-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='movdir64b'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='movdiri'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pbrsb-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='prefetchiti'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='psdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xfd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Haswell'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Haswell-IBRS'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Haswell-noTSX'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Haswell-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Haswell-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Haswell-v3'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Haswell-v4'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server-noTSX'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server-v3'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server-v4'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server-v5'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server-v6'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server-v7'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='IvyBridge'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='IvyBridge-IBRS'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='IvyBridge-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='IvyBridge-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='KnightsMill'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-4fmaps'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-4vnniw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512er'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512pf'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='KnightsMill-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-4fmaps'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-4vnniw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512er'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512pf'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Opteron_G4'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fma4'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xop'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Opteron_G4-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fma4'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xop'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Opteron_G5'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fma4'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='tbm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xop'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Opteron_G5-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fma4'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='tbm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xop'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='SapphireRapids'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-int8'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-tile'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-fp16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrc'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fzrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xfd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='SapphireRapids-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-int8'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-tile'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-fp16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrc'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fzrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xfd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='SapphireRapids-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-int8'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-tile'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-fp16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fbsdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrc'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fzrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='psdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xfd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='SapphireRapids-v3'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-int8'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-tile'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-fp16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='cldemote'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fbsdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrc'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fzrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='movdir64b'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='movdiri'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='psdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xfd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='SierraForest'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx-ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx-ne-convert'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx-vnni-int8'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='cmpccxadd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fbsdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='mcdt-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pbrsb-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='psdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='SierraForest-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx-ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx-ne-convert'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx-vnni-int8'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='cmpccxadd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fbsdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='mcdt-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pbrsb-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='psdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Skylake-Client'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Skylake-Client-IBRS'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Skylake-Client-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Skylake-Client-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Skylake-Client-v3'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Skylake-Client-v4'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Skylake-Server'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Skylake-Server-IBRS'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Skylake-Server-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Skylake-Server-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Skylake-Server-v3'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Skylake-Server-v4'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Skylake-Server-v5'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Snowridge'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='cldemote'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='core-capability'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='movdir64b'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='movdiri'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='mpx'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='split-lock-detect'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Snowridge-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='cldemote'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='core-capability'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='movdir64b'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='movdiri'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='mpx'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='split-lock-detect'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Snowridge-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='cldemote'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='core-capability'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='movdir64b'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='movdiri'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='split-lock-detect'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Snowridge-v3'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='cldemote'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='core-capability'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='movdir64b'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='movdiri'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='split-lock-detect'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Snowridge-v4'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='cldemote'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='movdir64b'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='movdiri'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='athlon'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='3dnow'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='3dnowext'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='athlon-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='3dnow'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='3dnowext'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='core2duo'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='core2duo-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='coreduo'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='coreduo-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='n270'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='n270-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='phenom'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='3dnow'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='3dnowext'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='phenom-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='3dnow'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='3dnowext'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </mode>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   </cpu>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   <memoryBacking supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <enum name='sourceType'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <value>file</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <value>anonymous</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <value>memfd</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   </memoryBacking>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   <devices>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <disk supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='diskDevice'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>disk</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>cdrom</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>floppy</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>lun</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='bus'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>ide</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>fdc</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>scsi</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>virtio</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>usb</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>sata</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='model'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>virtio</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>virtio-transitional</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>virtio-non-transitional</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </disk>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <graphics supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='type'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>vnc</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>egl-headless</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>dbus</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </graphics>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <video supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='modelType'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>vga</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>cirrus</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>virtio</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>none</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>bochs</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>ramfb</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </video>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <hostdev supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='mode'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>subsystem</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='startupPolicy'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>default</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>mandatory</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>requisite</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>optional</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='subsysType'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>usb</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>pci</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>scsi</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='capsType'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='pciBackend'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </hostdev>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <rng supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='model'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>virtio</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>virtio-transitional</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>virtio-non-transitional</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='backendModel'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>random</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>egd</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>builtin</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </rng>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <filesystem supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='driverType'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>path</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>handle</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>virtiofs</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </filesystem>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <tpm supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='model'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>tpm-tis</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>tpm-crb</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='backendModel'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>emulator</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>external</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='backendVersion'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>2.0</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </tpm>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <redirdev supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='bus'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>usb</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </redirdev>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <channel supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='type'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>pty</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>unix</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </channel>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <crypto supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='model'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='type'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>qemu</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='backendModel'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>builtin</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </crypto>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <interface supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='backendType'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>default</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>passt</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </interface>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <panic supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='model'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>isa</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>hyperv</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </panic>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <console supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='type'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>null</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>vc</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>pty</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>dev</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>file</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>pipe</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>stdio</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>udp</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>tcp</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>unix</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>qemu-vdagent</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>dbus</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </console>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   </devices>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   <features>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <gic supported='no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <vmcoreinfo supported='yes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <genid supported='yes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <backingStoreInput supported='yes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <backup supported='yes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <async-teardown supported='yes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <ps2 supported='yes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <sev supported='no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <sgx supported='no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <hyperv supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='features'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>relaxed</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>vapic</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>spinlocks</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>vpindex</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>runtime</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>synic</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>stimer</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>reset</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>vendor_id</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>frequencies</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>reenlightenment</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>tlbflush</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>ipi</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>avic</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>emsr_bitmap</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>xmm_input</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <defaults>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <spinlocks>4095</spinlocks>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <stimer_direct>on</stimer_direct>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <tlbflush_direct>on</tlbflush_direct>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <tlbflush_extended>on</tlbflush_extended>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </defaults>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </hyperv>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <launchSecurity supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='sectype'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>tdx</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </launchSecurity>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   </features>
Dec 01 09:32:27 compute-0 nova_compute[250706]: </domainCapabilities>
Dec 01 09:32:27 compute-0 nova_compute[250706]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.902 250710 DEBUG nova.virt.libvirt.host [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 01 09:32:27 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.907 250710 DEBUG nova.virt.libvirt.host [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec 01 09:32:27 compute-0 nova_compute[250706]: <domainCapabilities>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   <path>/usr/libexec/qemu-kvm</path>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   <domain>kvm</domain>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   <arch>x86_64</arch>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   <vcpu max='4096'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   <iothreads supported='yes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   <os supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <enum name='firmware'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <value>efi</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <loader supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='type'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>rom</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>pflash</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='readonly'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>yes</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>no</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='secure'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>yes</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>no</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </loader>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   </os>
Dec 01 09:32:27 compute-0 nova_compute[250706]:   <cpu>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <mode name='host-passthrough' supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='hostPassthroughMigratable'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>on</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>off</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </mode>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <mode name='maximum' supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <enum name='maximumMigratable'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>on</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <value>off</value>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </mode>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <mode name='host-model' supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <vendor>AMD</vendor>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='x2apic'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='tsc-deadline'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='hypervisor'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='tsc_adjust'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='spec-ctrl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='stibp'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='ssbd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='cmp_legacy'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='overflow-recov'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='succor'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='ibrs'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='amd-ssbd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='virt-ssbd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='lbrv'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='tsc-scale'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='vmcb-clean'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='flushbyasid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='pause-filter'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='pfthreshold'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='svme-addr-chk'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <feature policy='disable' name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     </mode>
Dec 01 09:32:27 compute-0 nova_compute[250706]:     <mode name='custom' supported='yes'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Broadwell'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Broadwell-IBRS'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Broadwell-noTSX'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Broadwell-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Broadwell-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Broadwell-v3'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Broadwell-v4'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Cascadelake-Server'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Cascadelake-Server-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Cascadelake-Server-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Cascadelake-Server-v3'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Cascadelake-Server-v4'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Cascadelake-Server-v5'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Cooperlake'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Cooperlake-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Cooperlake-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Denverton'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='mpx'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Denverton-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='mpx'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Denverton-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Denverton-v3'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Dhyana-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='EPYC-Genoa'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amd-psfd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='auto-ibrs'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='no-nested-data-bp'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='null-sel-clr-base'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='stibp-always-on'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='EPYC-Genoa-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amd-psfd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='auto-ibrs'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='no-nested-data-bp'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='null-sel-clr-base'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='stibp-always-on'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='EPYC-Milan'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='EPYC-Milan-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='EPYC-Milan-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amd-psfd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='no-nested-data-bp'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='null-sel-clr-base'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='stibp-always-on'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='EPYC-Rome'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='EPYC-Rome-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='EPYC-Rome-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='EPYC-Rome-v3'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='EPYC-v3'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='EPYC-v4'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='GraniteRapids'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-fp16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-int8'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-tile'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-fp16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fbsdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrc'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fzrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='mcdt-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pbrsb-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='prefetchiti'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='psdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xfd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='GraniteRapids-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-fp16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-int8'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-tile'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-fp16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fbsdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrc'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fzrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='mcdt-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pbrsb-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='prefetchiti'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='psdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xfd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='GraniteRapids-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-fp16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-int8'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='amx-tile'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx10'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx10-128'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx10-256'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx10-512'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-fp16'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='cldemote'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fbsdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrc'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fzrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='mcdt-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='movdir64b'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='movdiri'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pbrsb-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='prefetchiti'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='psdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xfd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Haswell'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Haswell-IBRS'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Haswell-noTSX'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Haswell-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Haswell-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Haswell-v3'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Haswell-v4'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server-noTSX'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server-v1'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server-v2'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server-v3'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server-v4'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server-v5'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server-v6'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server-v7'>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:27 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='IvyBridge'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='IvyBridge-IBRS'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='IvyBridge-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='IvyBridge-v2'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='KnightsMill'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-4fmaps'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-4vnniw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512er'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512pf'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='KnightsMill-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-4fmaps'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-4vnniw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512er'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512pf'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Opteron_G4'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fma4'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xop'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Opteron_G4-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fma4'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xop'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Opteron_G5'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fma4'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='tbm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xop'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Opteron_G5-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fma4'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='tbm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xop'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='SapphireRapids'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-bf16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-int8'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-tile'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-fp16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrc'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fzrm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xfd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='SapphireRapids-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-bf16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-int8'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-tile'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-fp16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrc'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fzrm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xfd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='SapphireRapids-v2'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-bf16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-int8'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-tile'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-fp16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fbsdp-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrc'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fzrm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='psdp-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xfd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='SapphireRapids-v3'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-bf16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-int8'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-tile'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-fp16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='cldemote'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fbsdp-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrc'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fzrm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='movdir64b'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='movdiri'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='psdp-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xfd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='SierraForest'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx-ifma'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx-ne-convert'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx-vnni-int8'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='cmpccxadd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fbsdp-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='mcdt-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pbrsb-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='psdp-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='SierraForest-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx-ifma'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx-ne-convert'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx-vnni-int8'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='cmpccxadd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fbsdp-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='mcdt-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pbrsb-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='psdp-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Skylake-Client'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Skylake-Client-IBRS'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Skylake-Client-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Skylake-Client-v2'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Skylake-Client-v3'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Skylake-Client-v4'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Skylake-Server'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Skylake-Server-IBRS'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Skylake-Server-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Skylake-Server-v2'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Skylake-Server-v3'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Skylake-Server-v4'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Skylake-Server-v5'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Snowridge'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='cldemote'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='core-capability'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='movdir64b'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='movdiri'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='mpx'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='split-lock-detect'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Snowridge-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='cldemote'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='core-capability'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='movdir64b'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='movdiri'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='mpx'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='split-lock-detect'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Snowridge-v2'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='cldemote'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='core-capability'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='movdir64b'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='movdiri'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='split-lock-detect'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Snowridge-v3'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='cldemote'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='core-capability'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='movdir64b'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='movdiri'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='split-lock-detect'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Snowridge-v4'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='cldemote'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='movdir64b'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='movdiri'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='athlon'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='3dnow'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='3dnowext'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='athlon-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='3dnow'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='3dnowext'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='core2duo'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='core2duo-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='coreduo'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='coreduo-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='n270'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='n270-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='phenom'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='3dnow'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='3dnowext'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='phenom-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='3dnow'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='3dnowext'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </mode>
Dec 01 09:32:28 compute-0 nova_compute[250706]:   </cpu>
Dec 01 09:32:28 compute-0 nova_compute[250706]:   <memoryBacking supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <enum name='sourceType'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <value>file</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <value>anonymous</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <value>memfd</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:   </memoryBacking>
Dec 01 09:32:28 compute-0 nova_compute[250706]:   <devices>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <disk supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='diskDevice'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>disk</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>cdrom</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>floppy</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>lun</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='bus'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>fdc</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>scsi</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>virtio</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>usb</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>sata</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='model'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>virtio</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>virtio-transitional</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>virtio-non-transitional</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </disk>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <graphics supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='type'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>vnc</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>egl-headless</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>dbus</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </graphics>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <video supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='modelType'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>vga</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>cirrus</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>virtio</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>none</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>bochs</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>ramfb</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </video>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <hostdev supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='mode'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>subsystem</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='startupPolicy'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>default</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>mandatory</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>requisite</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>optional</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='subsysType'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>usb</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>pci</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>scsi</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='capsType'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='pciBackend'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </hostdev>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <rng supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='model'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>virtio</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>virtio-transitional</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>virtio-non-transitional</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='backendModel'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>random</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>egd</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>builtin</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </rng>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <filesystem supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='driverType'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>path</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>handle</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>virtiofs</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </filesystem>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <tpm supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='model'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>tpm-tis</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>tpm-crb</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='backendModel'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>emulator</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>external</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='backendVersion'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>2.0</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </tpm>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <redirdev supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='bus'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>usb</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </redirdev>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <channel supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='type'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>pty</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>unix</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </channel>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <crypto supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='model'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='type'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>qemu</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='backendModel'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>builtin</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </crypto>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <interface supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='backendType'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>default</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>passt</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </interface>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <panic supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='model'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>isa</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>hyperv</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </panic>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <console supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='type'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>null</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>vc</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>pty</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>dev</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>file</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>pipe</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>stdio</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>udp</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>tcp</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>unix</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>qemu-vdagent</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>dbus</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </console>
Dec 01 09:32:28 compute-0 nova_compute[250706]:   </devices>
Dec 01 09:32:28 compute-0 nova_compute[250706]:   <features>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <gic supported='no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <vmcoreinfo supported='yes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <genid supported='yes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <backingStoreInput supported='yes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <backup supported='yes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <async-teardown supported='yes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <ps2 supported='yes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <sev supported='no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <sgx supported='no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <hyperv supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='features'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>relaxed</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>vapic</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>spinlocks</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>vpindex</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>runtime</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>synic</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>stimer</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>reset</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>vendor_id</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>frequencies</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>reenlightenment</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>tlbflush</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>ipi</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>avic</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>emsr_bitmap</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>xmm_input</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <defaults>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <spinlocks>4095</spinlocks>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <stimer_direct>on</stimer_direct>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <tlbflush_direct>on</tlbflush_direct>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <tlbflush_extended>on</tlbflush_extended>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </defaults>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </hyperv>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <launchSecurity supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='sectype'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>tdx</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </launchSecurity>
Dec 01 09:32:28 compute-0 nova_compute[250706]:   </features>
Dec 01 09:32:28 compute-0 nova_compute[250706]: </domainCapabilities>
Dec 01 09:32:28 compute-0 nova_compute[250706]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 01 09:32:28 compute-0 nova_compute[250706]: 2025-12-01 09:32:27.963 250710 DEBUG nova.virt.libvirt.host [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 01 09:32:28 compute-0 nova_compute[250706]: <domainCapabilities>
Dec 01 09:32:28 compute-0 nova_compute[250706]:   <path>/usr/libexec/qemu-kvm</path>
Dec 01 09:32:28 compute-0 nova_compute[250706]:   <domain>kvm</domain>
Dec 01 09:32:28 compute-0 nova_compute[250706]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 01 09:32:28 compute-0 nova_compute[250706]:   <arch>x86_64</arch>
Dec 01 09:32:28 compute-0 nova_compute[250706]:   <vcpu max='240'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:   <iothreads supported='yes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:   <os supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <enum name='firmware'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <loader supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='type'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>rom</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>pflash</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='readonly'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>yes</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>no</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='secure'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>no</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </loader>
Dec 01 09:32:28 compute-0 nova_compute[250706]:   </os>
Dec 01 09:32:28 compute-0 nova_compute[250706]:   <cpu>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <mode name='host-passthrough' supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='hostPassthroughMigratable'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>on</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>off</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </mode>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <mode name='maximum' supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='maximumMigratable'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>on</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>off</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </mode>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <mode name='host-model' supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <vendor>AMD</vendor>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <feature policy='require' name='x2apic'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <feature policy='require' name='tsc-deadline'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <feature policy='require' name='hypervisor'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <feature policy='require' name='tsc_adjust'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <feature policy='require' name='spec-ctrl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <feature policy='require' name='stibp'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <feature policy='require' name='ssbd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <feature policy='require' name='cmp_legacy'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <feature policy='require' name='overflow-recov'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <feature policy='require' name='succor'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <feature policy='require' name='ibrs'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <feature policy='require' name='amd-ssbd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <feature policy='require' name='virt-ssbd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <feature policy='require' name='lbrv'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <feature policy='require' name='tsc-scale'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <feature policy='require' name='vmcb-clean'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <feature policy='require' name='flushbyasid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <feature policy='require' name='pause-filter'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <feature policy='require' name='pfthreshold'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <feature policy='require' name='svme-addr-chk'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <feature policy='disable' name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </mode>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <mode name='custom' supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Broadwell'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Broadwell-IBRS'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Broadwell-noTSX'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Broadwell-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Broadwell-v2'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Broadwell-v3'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Broadwell-v4'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Cascadelake-Server'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Cascadelake-Server-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Cascadelake-Server-v2'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Cascadelake-Server-v3'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Cascadelake-Server-v4'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Cascadelake-Server-v5'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Cooperlake'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Cooperlake-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Cooperlake-v2'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Denverton'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='mpx'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Denverton-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='mpx'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Denverton-v2'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Denverton-v3'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Dhyana-v2'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='EPYC-Genoa'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amd-psfd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='auto-ibrs'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='no-nested-data-bp'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='null-sel-clr-base'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='stibp-always-on'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='EPYC-Genoa-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amd-psfd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='auto-ibrs'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='no-nested-data-bp'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='null-sel-clr-base'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='stibp-always-on'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='EPYC-Milan'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='EPYC-Milan-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='EPYC-Milan-v2'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amd-psfd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='no-nested-data-bp'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='null-sel-clr-base'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='stibp-always-on'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='EPYC-Rome'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='EPYC-Rome-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='EPYC-Rome-v2'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='EPYC-Rome-v3'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='EPYC-v3'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='EPYC-v4'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='GraniteRapids'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-bf16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-fp16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-int8'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-tile'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-fp16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fbsdp-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrc'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fzrm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='mcdt-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pbrsb-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='prefetchiti'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='psdp-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xfd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='GraniteRapids-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-bf16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-fp16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-int8'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-tile'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-fp16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fbsdp-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrc'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fzrm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='mcdt-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pbrsb-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='prefetchiti'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='psdp-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xfd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='GraniteRapids-v2'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-bf16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-fp16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-int8'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-tile'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx10'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx10-128'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx10-256'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx10-512'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-fp16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='cldemote'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fbsdp-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrc'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fzrm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='mcdt-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='movdir64b'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='movdiri'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pbrsb-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='prefetchiti'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='psdp-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xfd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Haswell'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Haswell-IBRS'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Haswell-noTSX'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Haswell-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Haswell-v2'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Haswell-v3'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Haswell-v4'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server-noTSX'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server-v2'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server-v3'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server-v4'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server-v5'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server-v6'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Icelake-Server-v7'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='IvyBridge'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='IvyBridge-IBRS'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='IvyBridge-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='IvyBridge-v2'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='KnightsMill'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-4fmaps'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-4vnniw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512er'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512pf'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='KnightsMill-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-4fmaps'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-4vnniw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512er'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512pf'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Opteron_G4'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fma4'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xop'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Opteron_G4-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fma4'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xop'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Opteron_G5'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fma4'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='tbm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xop'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Opteron_G5-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fma4'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='tbm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xop'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='SapphireRapids'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-bf16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-int8'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-tile'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-fp16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrc'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fzrm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xfd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='SapphireRapids-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-bf16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-int8'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-tile'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-fp16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrc'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fzrm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xfd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='SapphireRapids-v2'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-bf16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-int8'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-tile'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-fp16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fbsdp-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrc'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fzrm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='psdp-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xfd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='SapphireRapids-v3'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-bf16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-int8'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='amx-tile'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-bf16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-fp16'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512-vpopcntdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bitalg'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512ifma'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vbmi2'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='cldemote'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fbsdp-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrc'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fzrm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='la57'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='movdir64b'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='movdiri'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='psdp-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='taa-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='tsx-ldtrk'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xfd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='SierraForest'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx-ifma'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx-ne-convert'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx-vnni-int8'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='cmpccxadd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fbsdp-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='mcdt-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pbrsb-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='psdp-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='SierraForest-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx-ifma'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx-ne-convert'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx-vnni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx-vnni-int8'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='bus-lock-detect'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='cmpccxadd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fbsdp-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='fsrs'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ibrs-all'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='mcdt-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pbrsb-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='psdp-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='sbdr-ssdp-no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='serialize'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vaes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='vpclmulqdq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Skylake-Client'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Skylake-Client-IBRS'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Skylake-Client-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Skylake-Client-v2'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Skylake-Client-v3'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Skylake-Client-v4'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Skylake-Server'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Skylake-Server-IBRS'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Skylake-Server-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Skylake-Server-v2'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='hle'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='rtm'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Skylake-Server-v3'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Skylake-Server-v4'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Skylake-Server-v5'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512bw'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512cd'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512dq'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512f'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='avx512vl'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='invpcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pcid'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='pku'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Snowridge'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='cldemote'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='core-capability'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='movdir64b'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='movdiri'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='mpx'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='split-lock-detect'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Snowridge-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='cldemote'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='core-capability'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='movdir64b'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='movdiri'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='mpx'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='split-lock-detect'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Snowridge-v2'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='cldemote'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='core-capability'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='movdir64b'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='movdiri'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='split-lock-detect'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Snowridge-v3'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='cldemote'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='core-capability'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='movdir64b'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='movdiri'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='split-lock-detect'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='Snowridge-v4'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='cldemote'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='erms'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='gfni'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='movdir64b'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='movdiri'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='xsaves'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='athlon'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='3dnow'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='3dnowext'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='athlon-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='3dnow'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='3dnowext'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='core2duo'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='core2duo-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='coreduo'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='coreduo-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='n270'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='n270-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='ss'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='phenom'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='3dnow'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='3dnowext'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <blockers model='phenom-v1'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='3dnow'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <feature name='3dnowext'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </blockers>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </mode>
Dec 01 09:32:28 compute-0 nova_compute[250706]:   </cpu>
Dec 01 09:32:28 compute-0 nova_compute[250706]:   <memoryBacking supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <enum name='sourceType'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <value>file</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <value>anonymous</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <value>memfd</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:   </memoryBacking>
Dec 01 09:32:28 compute-0 nova_compute[250706]:   <devices>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <disk supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='diskDevice'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>disk</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>cdrom</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>floppy</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>lun</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='bus'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>ide</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>fdc</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>scsi</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>virtio</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>usb</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>sata</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='model'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>virtio</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>virtio-transitional</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>virtio-non-transitional</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </disk>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <graphics supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='type'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>vnc</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>egl-headless</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>dbus</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </graphics>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <video supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='modelType'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>vga</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>cirrus</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>virtio</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>none</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>bochs</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>ramfb</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </video>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <hostdev supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='mode'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>subsystem</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='startupPolicy'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>default</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>mandatory</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>requisite</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>optional</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='subsysType'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>usb</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>pci</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>scsi</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='capsType'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='pciBackend'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </hostdev>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <rng supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='model'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>virtio</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>virtio-transitional</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>virtio-non-transitional</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='backendModel'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>random</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>egd</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>builtin</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </rng>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <filesystem supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='driverType'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>path</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>handle</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>virtiofs</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </filesystem>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <tpm supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='model'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>tpm-tis</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>tpm-crb</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='backendModel'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>emulator</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>external</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='backendVersion'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>2.0</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </tpm>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <redirdev supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='bus'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>usb</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </redirdev>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <channel supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='type'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>pty</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>unix</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </channel>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <crypto supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='model'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='type'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>qemu</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='backendModel'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>builtin</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </crypto>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <interface supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='backendType'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>default</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>passt</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </interface>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <panic supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='model'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>isa</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>hyperv</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </panic>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <console supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='type'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>null</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>vc</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>pty</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>dev</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>file</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>pipe</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>stdio</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>udp</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>tcp</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>unix</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>qemu-vdagent</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>dbus</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </console>
Dec 01 09:32:28 compute-0 nova_compute[250706]:   </devices>
Dec 01 09:32:28 compute-0 nova_compute[250706]:   <features>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <gic supported='no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <vmcoreinfo supported='yes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <genid supported='yes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <backingStoreInput supported='yes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <backup supported='yes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <async-teardown supported='yes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <ps2 supported='yes'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <sev supported='no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <sgx supported='no'/>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <hyperv supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='features'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>relaxed</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>vapic</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>spinlocks</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>vpindex</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>runtime</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>synic</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>stimer</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>reset</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>vendor_id</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>frequencies</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>reenlightenment</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>tlbflush</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>ipi</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>avic</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>emsr_bitmap</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>xmm_input</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <defaults>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <spinlocks>4095</spinlocks>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <stimer_direct>on</stimer_direct>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <tlbflush_direct>on</tlbflush_direct>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <tlbflush_extended>on</tlbflush_extended>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </defaults>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </hyperv>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     <launchSecurity supported='yes'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       <enum name='sectype'>
Dec 01 09:32:28 compute-0 nova_compute[250706]:         <value>tdx</value>
Dec 01 09:32:28 compute-0 nova_compute[250706]:       </enum>
Dec 01 09:32:28 compute-0 nova_compute[250706]:     </launchSecurity>
Dec 01 09:32:28 compute-0 nova_compute[250706]:   </features>
Dec 01 09:32:28 compute-0 nova_compute[250706]: </domainCapabilities>
Dec 01 09:32:28 compute-0 nova_compute[250706]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 01 09:32:28 compute-0 nova_compute[250706]: 2025-12-01 09:32:28.023 250710 DEBUG nova.virt.libvirt.host [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 01 09:32:28 compute-0 nova_compute[250706]: 2025-12-01 09:32:28.023 250710 INFO nova.virt.libvirt.host [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Secure Boot support detected
Dec 01 09:32:28 compute-0 nova_compute[250706]: 2025-12-01 09:32:28.027 250710 INFO nova.virt.libvirt.driver [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 01 09:32:28 compute-0 nova_compute[250706]: 2025-12-01 09:32:28.028 250710 INFO nova.virt.libvirt.driver [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 01 09:32:28 compute-0 nova_compute[250706]: 2025-12-01 09:32:28.041 250710 DEBUG nova.virt.libvirt.driver [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Dec 01 09:32:28 compute-0 nova_compute[250706]: 2025-12-01 09:32:28.086 250710 INFO nova.virt.node [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Determined node identity 847e3dbe-0f76-4032-a374-8c965945c22f from /var/lib/nova/compute_id
Dec 01 09:32:28 compute-0 nova_compute[250706]: 2025-12-01 09:32:28.120 250710 WARNING nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Compute nodes ['847e3dbe-0f76-4032-a374-8c965945c22f'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Dec 01 09:32:28 compute-0 nova_compute[250706]: 2025-12-01 09:32:28.169 250710 INFO nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Dec 01 09:32:28 compute-0 nova_compute[250706]: 2025-12-01 09:32:28.213 250710 WARNING nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Dec 01 09:32:28 compute-0 nova_compute[250706]: 2025-12-01 09:32:28.214 250710 DEBUG oslo_concurrency.lockutils [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:32:28 compute-0 nova_compute[250706]: 2025-12-01 09:32:28.215 250710 DEBUG oslo_concurrency.lockutils [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:32:28 compute-0 nova_compute[250706]: 2025-12-01 09:32:28.215 250710 DEBUG oslo_concurrency.lockutils [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:32:28 compute-0 nova_compute[250706]: 2025-12-01 09:32:28.215 250710 DEBUG nova.compute.resource_tracker [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 09:32:28 compute-0 nova_compute[250706]: 2025-12-01 09:32:28.216 250710 DEBUG oslo_concurrency.processutils [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:32:28 compute-0 ceph-mon[75031]: pgmap v615: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:28 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:32:28 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2759139554' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:32:28 compute-0 nova_compute[250706]: 2025-12-01 09:32:28.688 250710 DEBUG oslo_concurrency.processutils [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:32:28 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Dec 01 09:32:28 compute-0 systemd[1]: Started libvirt nodedev daemon.
Dec 01 09:32:29 compute-0 nova_compute[250706]: 2025-12-01 09:32:29.059 250710 WARNING nova.virt.libvirt.driver [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 09:32:29 compute-0 nova_compute[250706]: 2025-12-01 09:32:29.060 250710 DEBUG nova.compute.resource_tracker [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5314MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 09:32:29 compute-0 nova_compute[250706]: 2025-12-01 09:32:29.061 250710 DEBUG oslo_concurrency.lockutils [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:32:29 compute-0 nova_compute[250706]: 2025-12-01 09:32:29.061 250710 DEBUG oslo_concurrency.lockutils [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:32:29 compute-0 nova_compute[250706]: 2025-12-01 09:32:29.085 250710 WARNING nova.compute.resource_tracker [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] No compute node record for compute-0.ctlplane.example.com:847e3dbe-0f76-4032-a374-8c965945c22f: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 847e3dbe-0f76-4032-a374-8c965945c22f could not be found.
Dec 01 09:32:29 compute-0 nova_compute[250706]: 2025-12-01 09:32:29.120 250710 INFO nova.compute.resource_tracker [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: 847e3dbe-0f76-4032-a374-8c965945c22f
Dec 01 09:32:29 compute-0 nova_compute[250706]: 2025-12-01 09:32:29.220 250710 DEBUG nova.compute.resource_tracker [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 09:32:29 compute-0 nova_compute[250706]: 2025-12-01 09:32:29.220 250710 DEBUG nova.compute.resource_tracker [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 09:32:29 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v616: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:29 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2759139554' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:32:30 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:32:30 compute-0 nova_compute[250706]: 2025-12-01 09:32:30.579 250710 INFO nova.scheduler.client.report [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] [req-0f32e5bd-c8a2-45dd-8409-cb45e76985ee] Created resource provider record via placement API for resource provider with UUID 847e3dbe-0f76-4032-a374-8c965945c22f and name compute-0.ctlplane.example.com.
Dec 01 09:32:30 compute-0 ceph-mon[75031]: pgmap v616: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:31 compute-0 nova_compute[250706]: 2025-12-01 09:32:31.019 250710 DEBUG oslo_concurrency.processutils [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:32:31 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:32:31 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1601777968' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:32:31 compute-0 nova_compute[250706]: 2025-12-01 09:32:31.433 250710 DEBUG oslo_concurrency.processutils [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.414s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:32:31 compute-0 nova_compute[250706]: 2025-12-01 09:32:31.441 250710 DEBUG nova.virt.libvirt.host [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Dec 01 09:32:31 compute-0 nova_compute[250706]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Dec 01 09:32:31 compute-0 nova_compute[250706]: 2025-12-01 09:32:31.442 250710 INFO nova.virt.libvirt.host [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] kernel doesn't support AMD SEV
Dec 01 09:32:31 compute-0 nova_compute[250706]: 2025-12-01 09:32:31.444 250710 DEBUG nova.compute.provider_tree [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Updating inventory in ProviderTree for provider 847e3dbe-0f76-4032-a374-8c965945c22f with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 01 09:32:31 compute-0 nova_compute[250706]: 2025-12-01 09:32:31.445 250710 DEBUG nova.virt.libvirt.driver [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 01 09:32:31 compute-0 nova_compute[250706]: 2025-12-01 09:32:31.515 250710 DEBUG nova.scheduler.client.report [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Updated inventory for provider 847e3dbe-0f76-4032-a374-8c965945c22f with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Dec 01 09:32:31 compute-0 nova_compute[250706]: 2025-12-01 09:32:31.515 250710 DEBUG nova.compute.provider_tree [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Updating resource provider 847e3dbe-0f76-4032-a374-8c965945c22f generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Dec 01 09:32:31 compute-0 nova_compute[250706]: 2025-12-01 09:32:31.516 250710 DEBUG nova.compute.provider_tree [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Updating inventory in ProviderTree for provider 847e3dbe-0f76-4032-a374-8c965945c22f with inventory: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 01 09:32:31 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v617: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:31 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1601777968' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:32:31 compute-0 nova_compute[250706]: 2025-12-01 09:32:31.660 250710 DEBUG nova.compute.provider_tree [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Updating resource provider 847e3dbe-0f76-4032-a374-8c965945c22f generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Dec 01 09:32:31 compute-0 nova_compute[250706]: 2025-12-01 09:32:31.697 250710 DEBUG nova.compute.resource_tracker [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 09:32:31 compute-0 nova_compute[250706]: 2025-12-01 09:32:31.698 250710 DEBUG oslo_concurrency.lockutils [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.637s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:32:31 compute-0 nova_compute[250706]: 2025-12-01 09:32:31.698 250710 DEBUG nova.service [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Dec 01 09:32:31 compute-0 nova_compute[250706]: 2025-12-01 09:32:31.861 250710 DEBUG nova.service [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Dec 01 09:32:31 compute-0 nova_compute[250706]: 2025-12-01 09:32:31.862 250710 DEBUG nova.servicegroup.drivers.db [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Dec 01 09:32:32 compute-0 ceph-mon[75031]: pgmap v617: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:33 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v618: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:34 compute-0 ceph-mon[75031]: pgmap v618: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:35 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:32:35 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v619: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:36 compute-0 ceph-mon[75031]: pgmap v619: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:37 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v620: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:38 compute-0 ceph-mon[75031]: pgmap v620: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:39 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v621: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:32:40 compute-0 sudo[251091]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:32:40 compute-0 sudo[251091]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:32:40 compute-0 sudo[251091]: pam_unix(sudo:session): session closed for user root
Dec 01 09:32:40 compute-0 sudo[251116]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:32:40 compute-0 sudo[251116]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:32:40 compute-0 sudo[251116]: pam_unix(sudo:session): session closed for user root
Dec 01 09:32:40 compute-0 sudo[251141]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:32:40 compute-0 sudo[251141]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:32:40 compute-0 sudo[251141]: pam_unix(sudo:session): session closed for user root
Dec 01 09:32:40 compute-0 sudo[251166]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Dec 01 09:32:40 compute-0 sudo[251166]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:32:40 compute-0 ceph-mon[75031]: pgmap v621: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:40 compute-0 sudo[251166]: pam_unix(sudo:session): session closed for user root
Dec 01 09:32:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:32:40 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:32:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec 01 09:32:40 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:32:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec 01 09:32:40 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:32:40 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 16a85b5a-5546-429e-9698-a440e28dfd21 does not exist
Dec 01 09:32:40 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 82b4de4d-df1d-481e-8d7b-873fa8be56f6 does not exist
Dec 01 09:32:40 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 93e6d012-ce34-4722-b533-47eee0438b67 does not exist
Dec 01 09:32:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec 01 09:32:40 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:32:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec 01 09:32:40 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:32:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:32:40 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:32:40 compute-0 sudo[251221]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:32:40 compute-0 sudo[251221]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:32:40 compute-0 sudo[251221]: pam_unix(sudo:session): session closed for user root
Dec 01 09:32:41 compute-0 sudo[251246]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:32:41 compute-0 sudo[251246]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:32:41 compute-0 sudo[251246]: pam_unix(sudo:session): session closed for user root
Dec 01 09:32:41 compute-0 sudo[251271]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:32:41 compute-0 sudo[251271]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:32:41 compute-0 sudo[251271]: pam_unix(sudo:session): session closed for user root
Dec 01 09:32:41 compute-0 sudo[251296]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --yes --no-systemd
Dec 01 09:32:41 compute-0 sudo[251296]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:32:41 compute-0 podman[251364]: 2025-12-01 09:32:41.504042615 +0000 UTC m=+0.048677612 container create b291f1585ee72ca4d7104d2020c1098e730671f9b80f199eb1e4cbc4733c64df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_galileo, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Dec 01 09:32:41 compute-0 systemd[1]: Started libpod-conmon-b291f1585ee72ca4d7104d2020c1098e730671f9b80f199eb1e4cbc4733c64df.scope.
Dec 01 09:32:41 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:32:41 compute-0 podman[251364]: 2025-12-01 09:32:41.478035677 +0000 UTC m=+0.022670734 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:32:41 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v622: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:41 compute-0 podman[251364]: 2025-12-01 09:32:41.655413021 +0000 UTC m=+0.200048018 container init b291f1585ee72ca4d7104d2020c1098e730671f9b80f199eb1e4cbc4733c64df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_galileo, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef)
Dec 01 09:32:41 compute-0 podman[251364]: 2025-12-01 09:32:41.662856095 +0000 UTC m=+0.207491102 container start b291f1585ee72ca4d7104d2020c1098e730671f9b80f199eb1e4cbc4733c64df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_galileo, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.build-date=20250507, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Dec 01 09:32:41 compute-0 elastic_galileo[251380]: 167 167
Dec 01 09:32:41 compute-0 systemd[1]: libpod-b291f1585ee72ca4d7104d2020c1098e730671f9b80f199eb1e4cbc4733c64df.scope: Deactivated successfully.
Dec 01 09:32:41 compute-0 podman[251364]: 2025-12-01 09:32:41.719471724 +0000 UTC m=+0.264106771 container attach b291f1585ee72ca4d7104d2020c1098e730671f9b80f199eb1e4cbc4733c64df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_galileo, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2)
Dec 01 09:32:41 compute-0 podman[251364]: 2025-12-01 09:32:41.720570386 +0000 UTC m=+0.265205393 container died b291f1585ee72ca4d7104d2020c1098e730671f9b80f199eb1e4cbc4733c64df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_galileo, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:32:41 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:32:41 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:32:41 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:32:41 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:32:41 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:32:41 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:32:41 compute-0 systemd[1]: var-lib-containers-storage-overlay-6eb702f5ef26153352c77e49a46a32f0a382234047422cda2793bd06417e00e3-merged.mount: Deactivated successfully.
Dec 01 09:32:41 compute-0 podman[251364]: 2025-12-01 09:32:41.824589129 +0000 UTC m=+0.369224116 container remove b291f1585ee72ca4d7104d2020c1098e730671f9b80f199eb1e4cbc4733c64df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_galileo, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 01 09:32:41 compute-0 systemd[1]: libpod-conmon-b291f1585ee72ca4d7104d2020c1098e730671f9b80f199eb1e4cbc4733c64df.scope: Deactivated successfully.
Dec 01 09:32:42 compute-0 podman[251406]: 2025-12-01 09:32:42.020071314 +0000 UTC m=+0.046750486 container create 31ffd27441210749d442f14c23db0632fced7e8461acb67ee68550383cf9e7e1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_lederberg, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec 01 09:32:42 compute-0 systemd[1]: Started libpod-conmon-31ffd27441210749d442f14c23db0632fced7e8461acb67ee68550383cf9e7e1.scope.
Dec 01 09:32:42 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:32:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db5e5010db24d6ad744a9dfbe556db4b62302db2ef5d62129a33f418899eecfd/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:32:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db5e5010db24d6ad744a9dfbe556db4b62302db2ef5d62129a33f418899eecfd/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:32:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db5e5010db24d6ad744a9dfbe556db4b62302db2ef5d62129a33f418899eecfd/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:32:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db5e5010db24d6ad744a9dfbe556db4b62302db2ef5d62129a33f418899eecfd/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:32:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db5e5010db24d6ad744a9dfbe556db4b62302db2ef5d62129a33f418899eecfd/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:32:42 compute-0 podman[251406]: 2025-12-01 09:32:42.004086764 +0000 UTC m=+0.030765966 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:32:42 compute-0 podman[251406]: 2025-12-01 09:32:42.118181447 +0000 UTC m=+0.144860649 container init 31ffd27441210749d442f14c23db0632fced7e8461acb67ee68550383cf9e7e1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_lederberg, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:32:42 compute-0 podman[251406]: 2025-12-01 09:32:42.133997752 +0000 UTC m=+0.160676934 container start 31ffd27441210749d442f14c23db0632fced7e8461acb67ee68550383cf9e7e1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_lederberg, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, org.label-schema.vendor=CentOS)
Dec 01 09:32:42 compute-0 podman[251406]: 2025-12-01 09:32:42.137580945 +0000 UTC m=+0.164260127 container attach 31ffd27441210749d442f14c23db0632fced7e8461acb67ee68550383cf9e7e1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_lederberg, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:32:42 compute-0 ceph-mon[75031]: pgmap v622: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:32:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:32:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:32:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:32:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:32:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:32:43 compute-0 stoic_lederberg[251423]: --> passed data devices: 0 physical, 3 LVM
Dec 01 09:32:43 compute-0 stoic_lederberg[251423]: --> relative data size: 1.0
Dec 01 09:32:43 compute-0 stoic_lederberg[251423]: --> All data devices are unavailable
Dec 01 09:32:43 compute-0 systemd[1]: libpod-31ffd27441210749d442f14c23db0632fced7e8461acb67ee68550383cf9e7e1.scope: Deactivated successfully.
Dec 01 09:32:43 compute-0 podman[251406]: 2025-12-01 09:32:43.125912265 +0000 UTC m=+1.152591447 container died 31ffd27441210749d442f14c23db0632fced7e8461acb67ee68550383cf9e7e1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_lederberg, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250507, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:32:43 compute-0 systemd[1]: var-lib-containers-storage-overlay-db5e5010db24d6ad744a9dfbe556db4b62302db2ef5d62129a33f418899eecfd-merged.mount: Deactivated successfully.
Dec 01 09:32:43 compute-0 podman[251406]: 2025-12-01 09:32:43.182467302 +0000 UTC m=+1.209146484 container remove 31ffd27441210749d442f14c23db0632fced7e8461acb67ee68550383cf9e7e1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_lederberg, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef)
Dec 01 09:32:43 compute-0 systemd[1]: libpod-conmon-31ffd27441210749d442f14c23db0632fced7e8461acb67ee68550383cf9e7e1.scope: Deactivated successfully.
Dec 01 09:32:43 compute-0 sudo[251296]: pam_unix(sudo:session): session closed for user root
Dec 01 09:32:43 compute-0 sudo[251462]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:32:43 compute-0 sudo[251462]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:32:43 compute-0 sudo[251462]: pam_unix(sudo:session): session closed for user root
Dec 01 09:32:43 compute-0 sudo[251487]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:32:43 compute-0 sudo[251487]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:32:43 compute-0 sudo[251487]: pam_unix(sudo:session): session closed for user root
Dec 01 09:32:43 compute-0 sudo[251512]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:32:43 compute-0 sudo[251512]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:32:43 compute-0 sudo[251512]: pam_unix(sudo:session): session closed for user root
Dec 01 09:32:43 compute-0 sudo[251537]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- lvm list --format json
Dec 01 09:32:43 compute-0 sudo[251537]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:32:43 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v623: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:43 compute-0 podman[251601]: 2025-12-01 09:32:43.895197071 +0000 UTC m=+0.043947856 container create 49ea2b32b64a9121aab14d28cdeb6f5066e4128f9935591863e32d5cbef791a1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_margulis, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507)
Dec 01 09:32:43 compute-0 systemd[1]: Started libpod-conmon-49ea2b32b64a9121aab14d28cdeb6f5066e4128f9935591863e32d5cbef791a1.scope.
Dec 01 09:32:43 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:32:43 compute-0 podman[251601]: 2025-12-01 09:32:43.876271986 +0000 UTC m=+0.025022771 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:32:43 compute-0 podman[251601]: 2025-12-01 09:32:43.975036178 +0000 UTC m=+0.123786953 container init 49ea2b32b64a9121aab14d28cdeb6f5066e4128f9935591863e32d5cbef791a1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_margulis, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:32:43 compute-0 podman[251601]: 2025-12-01 09:32:43.98065259 +0000 UTC m=+0.129403345 container start 49ea2b32b64a9121aab14d28cdeb6f5066e4128f9935591863e32d5cbef791a1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_margulis, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Dec 01 09:32:43 compute-0 hopeful_margulis[251617]: 167 167
Dec 01 09:32:43 compute-0 systemd[1]: libpod-49ea2b32b64a9121aab14d28cdeb6f5066e4128f9935591863e32d5cbef791a1.scope: Deactivated successfully.
Dec 01 09:32:43 compute-0 podman[251601]: 2025-12-01 09:32:43.991346147 +0000 UTC m=+0.140096902 container attach 49ea2b32b64a9121aab14d28cdeb6f5066e4128f9935591863e32d5cbef791a1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_margulis, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:32:43 compute-0 podman[251601]: 2025-12-01 09:32:43.992886092 +0000 UTC m=+0.141636907 container died 49ea2b32b64a9121aab14d28cdeb6f5066e4128f9935591863e32d5cbef791a1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_margulis, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 01 09:32:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-08f47cab1a6ffa97d8a1054339c770b2d90061e6478b9a4153ac1fd1e9fb4ed6-merged.mount: Deactivated successfully.
Dec 01 09:32:44 compute-0 podman[251601]: 2025-12-01 09:32:44.030971568 +0000 UTC m=+0.179722333 container remove 49ea2b32b64a9121aab14d28cdeb6f5066e4128f9935591863e32d5cbef791a1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_margulis, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250507, CEPH_REF=reef, io.buildah.version=1.39.3)
Dec 01 09:32:44 compute-0 systemd[1]: libpod-conmon-49ea2b32b64a9121aab14d28cdeb6f5066e4128f9935591863e32d5cbef791a1.scope: Deactivated successfully.
Dec 01 09:32:44 compute-0 podman[251634]: 2025-12-01 09:32:44.106679316 +0000 UTC m=+0.070185501 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 01 09:32:44 compute-0 podman[251661]: 2025-12-01 09:32:44.188614394 +0000 UTC m=+0.040010192 container create d28e1e2361e2141fcde689c55392739135dff609e0bad9fc45c8b2938e630d54 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_mcclintock, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True)
Dec 01 09:32:44 compute-0 systemd[1]: Started libpod-conmon-d28e1e2361e2141fcde689c55392739135dff609e0bad9fc45c8b2938e630d54.scope.
Dec 01 09:32:44 compute-0 podman[251661]: 2025-12-01 09:32:44.171833441 +0000 UTC m=+0.023229259 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:32:44 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:32:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/06f7d588e0174f3d0d3e3e9617576f28b5b894b1ad05ed787b0cd126a800820b/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:32:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/06f7d588e0174f3d0d3e3e9617576f28b5b894b1ad05ed787b0cd126a800820b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:32:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/06f7d588e0174f3d0d3e3e9617576f28b5b894b1ad05ed787b0cd126a800820b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:32:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/06f7d588e0174f3d0d3e3e9617576f28b5b894b1ad05ed787b0cd126a800820b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:32:44 compute-0 podman[251661]: 2025-12-01 09:32:44.297550339 +0000 UTC m=+0.148946187 container init d28e1e2361e2141fcde689c55392739135dff609e0bad9fc45c8b2938e630d54 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_mcclintock, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2)
Dec 01 09:32:44 compute-0 podman[251661]: 2025-12-01 09:32:44.303797558 +0000 UTC m=+0.155193396 container start d28e1e2361e2141fcde689c55392739135dff609e0bad9fc45c8b2938e630d54 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_mcclintock, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 01 09:32:44 compute-0 podman[251661]: 2025-12-01 09:32:44.308867024 +0000 UTC m=+0.160262852 container attach d28e1e2361e2141fcde689c55392739135dff609e0bad9fc45c8b2938e630d54 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_mcclintock, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:32:44 compute-0 ceph-mon[75031]: pgmap v623: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]: {
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:     "0": [
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:         {
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             "devices": [
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "/dev/loop3"
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             ],
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             "lv_name": "ceph_lv0",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             "lv_size": "21470642176",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             "name": "ceph_lv0",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             "tags": {
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.cluster_name": "ceph",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.crush_device_class": "",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.encrypted": "0",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.osd_id": "0",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.type": "block",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.vdo": "0"
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             },
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             "type": "block",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             "vg_name": "ceph_vg0"
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:         }
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:     ],
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:     "1": [
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:         {
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             "devices": [
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "/dev/loop4"
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             ],
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             "lv_name": "ceph_lv1",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             "lv_size": "21470642176",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             "name": "ceph_lv1",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             "tags": {
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.cluster_name": "ceph",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.crush_device_class": "",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.encrypted": "0",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.osd_id": "1",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.type": "block",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.vdo": "0"
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             },
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             "type": "block",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             "vg_name": "ceph_vg1"
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:         }
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:     ],
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:     "2": [
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:         {
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             "devices": [
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "/dev/loop5"
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             ],
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             "lv_name": "ceph_lv2",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             "lv_size": "21470642176",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             "name": "ceph_lv2",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             "tags": {
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.cluster_name": "ceph",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.crush_device_class": "",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.encrypted": "0",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.osd_id": "2",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.type": "block",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:                 "ceph.vdo": "0"
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             },
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             "type": "block",
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:             "vg_name": "ceph_vg2"
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:         }
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]:     ]
Dec 01 09:32:45 compute-0 priceless_mcclintock[251678]: }
Dec 01 09:32:45 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:32:45 compute-0 systemd[1]: libpod-d28e1e2361e2141fcde689c55392739135dff609e0bad9fc45c8b2938e630d54.scope: Deactivated successfully.
Dec 01 09:32:45 compute-0 podman[251661]: 2025-12-01 09:32:45.129862759 +0000 UTC m=+0.981258557 container died d28e1e2361e2141fcde689c55392739135dff609e0bad9fc45c8b2938e630d54 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_mcclintock, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_REF=reef, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:32:45 compute-0 systemd[1]: var-lib-containers-storage-overlay-06f7d588e0174f3d0d3e3e9617576f28b5b894b1ad05ed787b0cd126a800820b-merged.mount: Deactivated successfully.
Dec 01 09:32:45 compute-0 podman[251661]: 2025-12-01 09:32:45.180077474 +0000 UTC m=+1.031473272 container remove d28e1e2361e2141fcde689c55392739135dff609e0bad9fc45c8b2938e630d54 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_mcclintock, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 01 09:32:45 compute-0 systemd[1]: libpod-conmon-d28e1e2361e2141fcde689c55392739135dff609e0bad9fc45c8b2938e630d54.scope: Deactivated successfully.
Dec 01 09:32:45 compute-0 sudo[251537]: pam_unix(sudo:session): session closed for user root
Dec 01 09:32:45 compute-0 sudo[251697]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:32:45 compute-0 sudo[251697]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:32:45 compute-0 sudo[251697]: pam_unix(sudo:session): session closed for user root
Dec 01 09:32:45 compute-0 sudo[251722]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:32:45 compute-0 sudo[251722]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:32:45 compute-0 sudo[251722]: pam_unix(sudo:session): session closed for user root
Dec 01 09:32:45 compute-0 sudo[251747]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:32:45 compute-0 sudo[251747]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:32:45 compute-0 sudo[251747]: pam_unix(sudo:session): session closed for user root
Dec 01 09:32:45 compute-0 sudo[251772]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- raw list --format json
Dec 01 09:32:45 compute-0 sudo[251772]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:32:45 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v624: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:45 compute-0 podman[251835]: 2025-12-01 09:32:45.762230785 +0000 UTC m=+0.046455647 container create eed833b937c90f7dc8eb3c90136e5b2fe0e2c616812da50f85819e32b14311bf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_wright, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:32:45 compute-0 systemd[1]: Started libpod-conmon-eed833b937c90f7dc8eb3c90136e5b2fe0e2c616812da50f85819e32b14311bf.scope.
Dec 01 09:32:45 compute-0 podman[251835]: 2025-12-01 09:32:45.738087061 +0000 UTC m=+0.022311953 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:32:45 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:32:45 compute-0 podman[251835]: 2025-12-01 09:32:45.850981509 +0000 UTC m=+0.135206351 container init eed833b937c90f7dc8eb3c90136e5b2fe0e2c616812da50f85819e32b14311bf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_wright, org.label-schema.build-date=20250507, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:32:45 compute-0 podman[251835]: 2025-12-01 09:32:45.857556678 +0000 UTC m=+0.141781520 container start eed833b937c90f7dc8eb3c90136e5b2fe0e2c616812da50f85819e32b14311bf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_wright, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:32:45 compute-0 podman[251835]: 2025-12-01 09:32:45.860596516 +0000 UTC m=+0.144821378 container attach eed833b937c90f7dc8eb3c90136e5b2fe0e2c616812da50f85819e32b14311bf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_wright, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.build-date=20250507, ceph=True, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 01 09:32:45 compute-0 hardcore_wright[251852]: 167 167
Dec 01 09:32:45 compute-0 systemd[1]: libpod-eed833b937c90f7dc8eb3c90136e5b2fe0e2c616812da50f85819e32b14311bf.scope: Deactivated successfully.
Dec 01 09:32:45 compute-0 podman[251835]: 2025-12-01 09:32:45.863997304 +0000 UTC m=+0.148222166 container died eed833b937c90f7dc8eb3c90136e5b2fe0e2c616812da50f85819e32b14311bf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_wright, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.39.3)
Dec 01 09:32:45 compute-0 systemd[1]: var-lib-containers-storage-overlay-b9afd6fd5e072bf344966fd77833dabc81109e24757ce05368bd292c92c40138-merged.mount: Deactivated successfully.
Dec 01 09:32:45 compute-0 podman[251835]: 2025-12-01 09:32:45.9031497 +0000 UTC m=+0.187374572 container remove eed833b937c90f7dc8eb3c90136e5b2fe0e2c616812da50f85819e32b14311bf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_wright, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:32:45 compute-0 systemd[1]: libpod-conmon-eed833b937c90f7dc8eb3c90136e5b2fe0e2c616812da50f85819e32b14311bf.scope: Deactivated successfully.
Dec 01 09:32:46 compute-0 podman[251876]: 2025-12-01 09:32:46.102513057 +0000 UTC m=+0.047384244 container create d97ed3305cf3692c38ada10aaeca982f5d7fc3136d7790135df0d16351ca4400 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gallant_greider, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:32:46 compute-0 systemd[1]: Started libpod-conmon-d97ed3305cf3692c38ada10aaeca982f5d7fc3136d7790135df0d16351ca4400.scope.
Dec 01 09:32:46 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:32:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4df32bb0f4da53e35621a2cbc93460ba62c3c08bda2d6c971365066d7bd5cd08/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:32:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4df32bb0f4da53e35621a2cbc93460ba62c3c08bda2d6c971365066d7bd5cd08/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:32:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4df32bb0f4da53e35621a2cbc93460ba62c3c08bda2d6c971365066d7bd5cd08/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:32:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4df32bb0f4da53e35621a2cbc93460ba62c3c08bda2d6c971365066d7bd5cd08/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:32:46 compute-0 podman[251876]: 2025-12-01 09:32:46.082874862 +0000 UTC m=+0.027746109 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:32:46 compute-0 podman[251876]: 2025-12-01 09:32:46.192714733 +0000 UTC m=+0.137585960 container init d97ed3305cf3692c38ada10aaeca982f5d7fc3136d7790135df0d16351ca4400 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gallant_greider, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2)
Dec 01 09:32:46 compute-0 podman[251876]: 2025-12-01 09:32:46.201782744 +0000 UTC m=+0.146653931 container start d97ed3305cf3692c38ada10aaeca982f5d7fc3136d7790135df0d16351ca4400 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gallant_greider, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef)
Dec 01 09:32:46 compute-0 podman[251876]: 2025-12-01 09:32:46.205105069 +0000 UTC m=+0.149976286 container attach d97ed3305cf3692c38ada10aaeca982f5d7fc3136d7790135df0d16351ca4400 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gallant_greider, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3)
Dec 01 09:32:46 compute-0 ceph-mon[75031]: pgmap v624: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:47 compute-0 gallant_greider[251892]: {
Dec 01 09:32:47 compute-0 gallant_greider[251892]:     "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec 01 09:32:47 compute-0 gallant_greider[251892]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:32:47 compute-0 gallant_greider[251892]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 01 09:32:47 compute-0 gallant_greider[251892]:         "osd_id": 0,
Dec 01 09:32:47 compute-0 gallant_greider[251892]:         "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:32:47 compute-0 gallant_greider[251892]:         "type": "bluestore"
Dec 01 09:32:47 compute-0 gallant_greider[251892]:     },
Dec 01 09:32:47 compute-0 gallant_greider[251892]:     "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec 01 09:32:47 compute-0 gallant_greider[251892]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:32:47 compute-0 gallant_greider[251892]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 01 09:32:47 compute-0 gallant_greider[251892]:         "osd_id": 1,
Dec 01 09:32:47 compute-0 gallant_greider[251892]:         "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:32:47 compute-0 gallant_greider[251892]:         "type": "bluestore"
Dec 01 09:32:47 compute-0 gallant_greider[251892]:     },
Dec 01 09:32:47 compute-0 gallant_greider[251892]:     "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec 01 09:32:47 compute-0 gallant_greider[251892]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:32:47 compute-0 gallant_greider[251892]:         "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec 01 09:32:47 compute-0 gallant_greider[251892]:         "osd_id": 2,
Dec 01 09:32:47 compute-0 gallant_greider[251892]:         "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:32:47 compute-0 gallant_greider[251892]:         "type": "bluestore"
Dec 01 09:32:47 compute-0 gallant_greider[251892]:     }
Dec 01 09:32:47 compute-0 gallant_greider[251892]: }
Dec 01 09:32:47 compute-0 systemd[1]: libpod-d97ed3305cf3692c38ada10aaeca982f5d7fc3136d7790135df0d16351ca4400.scope: Deactivated successfully.
Dec 01 09:32:47 compute-0 systemd[1]: libpod-d97ed3305cf3692c38ada10aaeca982f5d7fc3136d7790135df0d16351ca4400.scope: Consumed 1.008s CPU time.
Dec 01 09:32:47 compute-0 podman[251876]: 2025-12-01 09:32:47.206810103 +0000 UTC m=+1.151681300 container died d97ed3305cf3692c38ada10aaeca982f5d7fc3136d7790135df0d16351ca4400 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gallant_greider, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Dec 01 09:32:47 compute-0 systemd[1]: var-lib-containers-storage-overlay-4df32bb0f4da53e35621a2cbc93460ba62c3c08bda2d6c971365066d7bd5cd08-merged.mount: Deactivated successfully.
Dec 01 09:32:47 compute-0 podman[251876]: 2025-12-01 09:32:47.265563034 +0000 UTC m=+1.210434221 container remove d97ed3305cf3692c38ada10aaeca982f5d7fc3136d7790135df0d16351ca4400 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gallant_greider, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:32:47 compute-0 systemd[1]: libpod-conmon-d97ed3305cf3692c38ada10aaeca982f5d7fc3136d7790135df0d16351ca4400.scope: Deactivated successfully.
Dec 01 09:32:47 compute-0 sudo[251772]: pam_unix(sudo:session): session closed for user root
Dec 01 09:32:47 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:32:47 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:32:47 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:32:47 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:32:47 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 628a28e3-921c-4801-990e-fce6753519ab does not exist
Dec 01 09:32:47 compute-0 sudo[251937]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:32:47 compute-0 sudo[251937]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:32:47 compute-0 sudo[251937]: pam_unix(sudo:session): session closed for user root
Dec 01 09:32:47 compute-0 sudo[251962]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:32:47 compute-0 sudo[251962]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:32:47 compute-0 sudo[251962]: pam_unix(sudo:session): session closed for user root
Dec 01 09:32:47 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v625: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:48 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:32:48 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:32:49 compute-0 ceph-mon[75031]: pgmap v625: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:49 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v626: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:49 compute-0 nova_compute[250706]: 2025-12-01 09:32:49.864 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:32:49 compute-0 nova_compute[250706]: 2025-12-01 09:32:49.971 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:32:50 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:32:51 compute-0 ceph-mon[75031]: pgmap v626: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:51 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v627: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:53 compute-0 ceph-mon[75031]: pgmap v627: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:53 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v628: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:54 compute-0 podman[251987]: 2025-12-01 09:32:54.042806009 +0000 UTC m=+0.139385491 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:32:54 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Dec 01 09:32:54 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/433414139' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:32:54 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Dec 01 09:32:54 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/433414139' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:32:54 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/433414139' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:32:54 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/433414139' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:32:54 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Dec 01 09:32:54 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/881557011' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:32:54 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Dec 01 09:32:54 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/881557011' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:32:55 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:32:55 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Dec 01 09:32:55 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2308178662' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:32:55 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Dec 01 09:32:55 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2308178662' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:32:55 compute-0 ceph-mon[75031]: pgmap v628: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:55 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/881557011' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:32:55 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/881557011' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:32:55 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/2308178662' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:32:55 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/2308178662' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:32:55 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v629: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:57 compute-0 ceph-mon[75031]: pgmap v629: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:57 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v630: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:57 compute-0 podman[252013]: 2025-12-01 09:32:57.955497528 +0000 UTC m=+0.061976234 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 01 09:32:58 compute-0 ceph-mon[75031]: pgmap v630: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:32:59 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v631: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:00 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:33:00 compute-0 ceph-mon[75031]: pgmap v631: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:01 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v632: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:02 compute-0 ceph-mon[75031]: pgmap v632: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:03 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v633: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:04 compute-0 ceph-mon[75031]: pgmap v633: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:05 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:33:05 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v634: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:06 compute-0 ceph-mon[75031]: pgmap v634: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:07 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v635: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:08 compute-0 ceph-mon[75031]: pgmap v635: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:09 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v636: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:10 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:33:10 compute-0 ceph-mon[75031]: pgmap v636: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:11 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v637: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:12 compute-0 ceph-mon[75031]: pgmap v637: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:33:13
Dec 01 09:33:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:33:13 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:33:13 compute-0 ceph-mgr[75324]: [balancer INFO root] pools ['.mgr', 'cephfs.cephfs.meta', 'backups', 'images', 'cephfs.cephfs.data', 'vms', 'volumes']
Dec 01 09:33:13 compute-0 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec 01 09:33:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:33:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:33:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:33:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:33:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:33:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:33:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:33:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:33:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:33:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:33:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:33:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:33:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:33:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:33:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:33:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:33:13 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v638: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:14 compute-0 ceph-mon[75031]: pgmap v638: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:14 compute-0 podman[252033]: 2025-12-01 09:33:14.96009004 +0000 UTC m=+0.061470330 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 01 09:33:15 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:33:15 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v639: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:16 compute-0 ceph-mon[75031]: pgmap v639: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:17 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v640: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:33:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:33:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:33:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:33:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:33:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:33:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:33:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:33:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:33:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:33:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:33:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:33:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec 01 09:33:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:33:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:33:18 compute-0 ceph-mon[75031]: pgmap v640: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:19 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v641: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:20 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:33:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:33:20.468 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:33:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:33:20.469 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:33:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:33:20.469 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:33:20 compute-0 ceph-mon[75031]: pgmap v641: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:21 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v642: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:22 compute-0 ceph-mon[75031]: pgmap v642: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:23 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v643: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:24 compute-0 ceph-mon[75031]: pgmap v643: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:25 compute-0 podman[252053]: 2025-12-01 09:33:25.024765403 +0000 UTC m=+0.120177420 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 01 09:33:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:33:25 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v644: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:26 compute-0 ceph-mon[75031]: pgmap v644: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:27 compute-0 nova_compute[250706]: 2025-12-01 09:33:27.054 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:33:27 compute-0 nova_compute[250706]: 2025-12-01 09:33:27.055 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:33:27 compute-0 nova_compute[250706]: 2025-12-01 09:33:27.055 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 09:33:27 compute-0 nova_compute[250706]: 2025-12-01 09:33:27.055 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 09:33:27 compute-0 nova_compute[250706]: 2025-12-01 09:33:27.077 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 09:33:27 compute-0 nova_compute[250706]: 2025-12-01 09:33:27.077 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:33:27 compute-0 nova_compute[250706]: 2025-12-01 09:33:27.078 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:33:27 compute-0 nova_compute[250706]: 2025-12-01 09:33:27.078 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:33:27 compute-0 nova_compute[250706]: 2025-12-01 09:33:27.078 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:33:27 compute-0 nova_compute[250706]: 2025-12-01 09:33:27.079 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:33:27 compute-0 nova_compute[250706]: 2025-12-01 09:33:27.079 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:33:27 compute-0 nova_compute[250706]: 2025-12-01 09:33:27.079 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 09:33:27 compute-0 nova_compute[250706]: 2025-12-01 09:33:27.079 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:33:27 compute-0 nova_compute[250706]: 2025-12-01 09:33:27.115 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:33:27 compute-0 nova_compute[250706]: 2025-12-01 09:33:27.115 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:33:27 compute-0 nova_compute[250706]: 2025-12-01 09:33:27.116 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:33:27 compute-0 nova_compute[250706]: 2025-12-01 09:33:27.117 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 09:33:27 compute-0 nova_compute[250706]: 2025-12-01 09:33:27.117 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:33:27 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:33:27 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/244408483' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:33:27 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v645: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:27 compute-0 nova_compute[250706]: 2025-12-01 09:33:27.616 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:33:27 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/244408483' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:33:27 compute-0 nova_compute[250706]: 2025-12-01 09:33:27.792 250710 WARNING nova.virt.libvirt.driver [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 09:33:27 compute-0 nova_compute[250706]: 2025-12-01 09:33:27.794 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5326MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 09:33:27 compute-0 nova_compute[250706]: 2025-12-01 09:33:27.794 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:33:27 compute-0 nova_compute[250706]: 2025-12-01 09:33:27.794 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:33:28 compute-0 nova_compute[250706]: 2025-12-01 09:33:28.022 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 09:33:28 compute-0 nova_compute[250706]: 2025-12-01 09:33:28.022 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 09:33:28 compute-0 nova_compute[250706]: 2025-12-01 09:33:28.062 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:33:28 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:33:28 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1698179103' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:33:28 compute-0 nova_compute[250706]: 2025-12-01 09:33:28.801 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.739s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:33:28 compute-0 nova_compute[250706]: 2025-12-01 09:33:28.809 250710 DEBUG nova.compute.provider_tree [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed in ProviderTree for provider: 847e3dbe-0f76-4032-a374-8c965945c22f update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 09:33:28 compute-0 nova_compute[250706]: 2025-12-01 09:33:28.835 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed for provider 847e3dbe-0f76-4032-a374-8c965945c22f based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 09:33:28 compute-0 nova_compute[250706]: 2025-12-01 09:33:28.837 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 09:33:28 compute-0 nova_compute[250706]: 2025-12-01 09:33:28.837 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.043s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:33:28 compute-0 ceph-mon[75031]: pgmap v645: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:28 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1698179103' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:33:28 compute-0 podman[252123]: 2025-12-01 09:33:28.963546412 +0000 UTC m=+0.065084014 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 01 09:33:29 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v646: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:30 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:33:30 compute-0 ceph-mon[75031]: pgmap v646: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:31 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v647: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:32 compute-0 ceph-mon[75031]: pgmap v647: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:33 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v648: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:35 compute-0 ceph-mon[75031]: pgmap v648: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:35 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:33:35 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v649: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:37 compute-0 ceph-mon[75031]: pgmap v649: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:37 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v650: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:39 compute-0 ceph-mon[75031]: pgmap v650: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:39 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v651: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:33:40 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #33. Immutable memtables: 0.
Dec 01 09:33:40 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:33:40.206612) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 09:33:40 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 33
Dec 01 09:33:40 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581620206689, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 2023, "num_deletes": 505, "total_data_size": 1939274, "memory_usage": 1977720, "flush_reason": "Manual Compaction"}
Dec 01 09:33:40 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #34: started
Dec 01 09:33:40 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581620245115, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 34, "file_size": 1891888, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12009, "largest_seqno": 14031, "table_properties": {"data_size": 1883212, "index_size": 4854, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2757, "raw_key_size": 19951, "raw_average_key_size": 18, "raw_value_size": 1863871, "raw_average_value_size": 1732, "num_data_blocks": 224, "num_entries": 1076, "num_filter_entries": 1076, "num_deletions": 505, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764581425, "oldest_key_time": 1764581425, "file_creation_time": 1764581620, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 34, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:33:40 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 38563 microseconds, and 7801 cpu microseconds.
Dec 01 09:33:40 compute-0 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 09:33:40 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:33:40.245187) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #34: 1891888 bytes OK
Dec 01 09:33:40 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:33:40.245218) [db/memtable_list.cc:519] [default] Level-0 commit table #34 started
Dec 01 09:33:40 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:33:40.247038) [db/memtable_list.cc:722] [default] Level-0 commit table #34: memtable #1 done
Dec 01 09:33:40 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:33:40.247054) EVENT_LOG_v1 {"time_micros": 1764581620247048, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 01 09:33:40 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:33:40.247078) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 01 09:33:40 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 1929720, prev total WAL file size 1946472, number of live WAL files 2.
Dec 01 09:33:40 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000030.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:33:40 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:33:40.247815) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0030' seq:72057594037927935, type:22 .. '6C6F676D00323531' seq:0, type:0; will stop at (end)
Dec 01 09:33:40 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 01 09:33:40 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [34(1847KB)], [32(4549KB)]
Dec 01 09:33:40 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581620247875, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [34], "files_L6": [32], "score": -1, "input_data_size": 6550343, "oldest_snapshot_seqno": -1}
Dec 01 09:33:40 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #35: 3252 keys, 5134531 bytes, temperature: kUnknown
Dec 01 09:33:40 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581620301903, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 35, "file_size": 5134531, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 5110952, "index_size": 14318, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8197, "raw_key_size": 77195, "raw_average_key_size": 23, "raw_value_size": 5050647, "raw_average_value_size": 1553, "num_data_blocks": 622, "num_entries": 3252, "num_filter_entries": 3252, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580340, "oldest_key_time": 0, "file_creation_time": 1764581620, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:33:40 compute-0 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 09:33:40 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:33:40.302151) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 5134531 bytes
Dec 01 09:33:40 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:33:40.303653) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 121.1 rd, 94.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.8, 4.4 +0.0 blob) out(4.9 +0.0 blob), read-write-amplify(6.2) write-amplify(2.7) OK, records in: 4275, records dropped: 1023 output_compression: NoCompression
Dec 01 09:33:40 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:33:40.303671) EVENT_LOG_v1 {"time_micros": 1764581620303662, "job": 14, "event": "compaction_finished", "compaction_time_micros": 54107, "compaction_time_cpu_micros": 33168, "output_level": 6, "num_output_files": 1, "total_output_size": 5134531, "num_input_records": 4275, "num_output_records": 3252, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 09:33:40 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000034.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:33:40 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581620304125, "job": 14, "event": "table_file_deletion", "file_number": 34}
Dec 01 09:33:40 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:33:40 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581620305233, "job": 14, "event": "table_file_deletion", "file_number": 32}
Dec 01 09:33:40 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:33:40.247682) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:33:40 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:33:40.305376) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:33:40 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:33:40.305385) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:33:40 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:33:40.305388) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:33:40 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:33:40.305390) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:33:40 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:33:40.305392) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:33:41 compute-0 ceph-mon[75031]: pgmap v651: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:41 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v652: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:33:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:33:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:33:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:33:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:33:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:33:43 compute-0 ceph-mon[75031]: pgmap v652: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:43 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v653: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:45 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:33:45 compute-0 ceph-mon[75031]: pgmap v653: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:45 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v654: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:45 compute-0 podman[252142]: 2025-12-01 09:33:45.98275966 +0000 UTC m=+0.076912664 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3)
Dec 01 09:33:47 compute-0 ceph-mon[75031]: pgmap v654: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:47 compute-0 sudo[252162]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:33:47 compute-0 sudo[252162]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:33:47 compute-0 sudo[252162]: pam_unix(sudo:session): session closed for user root
Dec 01 09:33:47 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v655: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:47 compute-0 sudo[252187]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:33:47 compute-0 sudo[252187]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:33:47 compute-0 sudo[252187]: pam_unix(sudo:session): session closed for user root
Dec 01 09:33:47 compute-0 sudo[252212]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:33:47 compute-0 sudo[252212]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:33:47 compute-0 sudo[252212]: pam_unix(sudo:session): session closed for user root
Dec 01 09:33:47 compute-0 sudo[252237]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 check-host
Dec 01 09:33:47 compute-0 sudo[252237]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:33:48 compute-0 sudo[252237]: pam_unix(sudo:session): session closed for user root
Dec 01 09:33:48 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:33:48 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:33:48 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:33:48 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:33:48 compute-0 sudo[252282]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:33:48 compute-0 sudo[252282]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:33:48 compute-0 sudo[252282]: pam_unix(sudo:session): session closed for user root
Dec 01 09:33:48 compute-0 sudo[252307]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:33:48 compute-0 sudo[252307]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:33:48 compute-0 sudo[252307]: pam_unix(sudo:session): session closed for user root
Dec 01 09:33:48 compute-0 sudo[252332]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:33:48 compute-0 sudo[252332]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:33:48 compute-0 sudo[252332]: pam_unix(sudo:session): session closed for user root
Dec 01 09:33:48 compute-0 sudo[252357]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Dec 01 09:33:48 compute-0 sudo[252357]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:33:48 compute-0 sudo[252357]: pam_unix(sudo:session): session closed for user root
Dec 01 09:33:48 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:33:48 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:33:48 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec 01 09:33:48 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:33:48 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "version", "format": "json"} v 0) v1
Dec 01 09:33:48 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3385313253' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Dec 01 09:33:48 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec 01 09:33:48 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14328 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Dec 01 09:33:48 compute-0 ceph-mgr[75324]: [volumes INFO volumes.module] Starting _cmd_fs_volume_ls(format:json, prefix:fs volume ls) < ""
Dec 01 09:33:48 compute-0 ceph-mgr[75324]: [volumes INFO volumes.module] Finishing _cmd_fs_volume_ls(format:json, prefix:fs volume ls) < ""
Dec 01 09:33:48 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:33:48 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev cf6f30bc-62b1-4f39-9170-8428315dad41 does not exist
Dec 01 09:33:48 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 922345fe-cd68-47fa-8d00-1c460b97c3ad does not exist
Dec 01 09:33:48 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 13e72bea-954e-4288-8864-bc0c63ae10b8 does not exist
Dec 01 09:33:48 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec 01 09:33:48 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:33:48 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec 01 09:33:48 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:33:48 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:33:48 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:33:48 compute-0 sudo[252414]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:33:48 compute-0 sudo[252414]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:33:48 compute-0 sudo[252414]: pam_unix(sudo:session): session closed for user root
Dec 01 09:33:49 compute-0 ceph-mon[75031]: pgmap v655: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:49 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:33:49 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:33:49 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:33:49 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:33:49 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/3385313253' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Dec 01 09:33:49 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:33:49 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:33:49 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:33:49 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:33:49 compute-0 sudo[252439]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:33:49 compute-0 sudo[252439]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:33:49 compute-0 sudo[252439]: pam_unix(sudo:session): session closed for user root
Dec 01 09:33:49 compute-0 sudo[252464]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:33:49 compute-0 sudo[252464]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:33:49 compute-0 sudo[252464]: pam_unix(sudo:session): session closed for user root
Dec 01 09:33:49 compute-0 sudo[252489]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --yes --no-systemd
Dec 01 09:33:49 compute-0 sudo[252489]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:33:49 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v656: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:49 compute-0 podman[252552]: 2025-12-01 09:33:49.609089119 +0000 UTC m=+0.067902795 container create 207ee24a86bd3dee9c7dea430682a34be20d07b8966d64bffa9898ae18bd5fa1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gallant_almeida, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS)
Dec 01 09:33:49 compute-0 systemd[1]: Started libpod-conmon-207ee24a86bd3dee9c7dea430682a34be20d07b8966d64bffa9898ae18bd5fa1.scope.
Dec 01 09:33:49 compute-0 podman[252552]: 2025-12-01 09:33:49.581816914 +0000 UTC m=+0.040630670 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:33:49 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:33:49 compute-0 podman[252552]: 2025-12-01 09:33:49.712226847 +0000 UTC m=+0.171040553 container init 207ee24a86bd3dee9c7dea430682a34be20d07b8966d64bffa9898ae18bd5fa1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gallant_almeida, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Dec 01 09:33:49 compute-0 podman[252552]: 2025-12-01 09:33:49.721345179 +0000 UTC m=+0.180158885 container start 207ee24a86bd3dee9c7dea430682a34be20d07b8966d64bffa9898ae18bd5fa1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gallant_almeida, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:33:49 compute-0 podman[252552]: 2025-12-01 09:33:49.72659771 +0000 UTC m=+0.185411426 container attach 207ee24a86bd3dee9c7dea430682a34be20d07b8966d64bffa9898ae18bd5fa1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gallant_almeida, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:33:49 compute-0 gallant_almeida[252568]: 167 167
Dec 01 09:33:49 compute-0 systemd[1]: libpod-207ee24a86bd3dee9c7dea430682a34be20d07b8966d64bffa9898ae18bd5fa1.scope: Deactivated successfully.
Dec 01 09:33:49 compute-0 conmon[252568]: conmon 207ee24a86bd3dee9c7d <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-207ee24a86bd3dee9c7dea430682a34be20d07b8966d64bffa9898ae18bd5fa1.scope/container/memory.events
Dec 01 09:33:49 compute-0 podman[252573]: 2025-12-01 09:33:49.773347846 +0000 UTC m=+0.026232796 container died 207ee24a86bd3dee9c7dea430682a34be20d07b8966d64bffa9898ae18bd5fa1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gallant_almeida, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.39.3)
Dec 01 09:33:49 compute-0 systemd[1]: var-lib-containers-storage-overlay-1cfcfafe77cc15f66767b2972a25715046507fc3aa71dc0698102596c9854407-merged.mount: Deactivated successfully.
Dec 01 09:33:49 compute-0 podman[252573]: 2025-12-01 09:33:49.812862063 +0000 UTC m=+0.065746993 container remove 207ee24a86bd3dee9c7dea430682a34be20d07b8966d64bffa9898ae18bd5fa1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gallant_almeida, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Dec 01 09:33:49 compute-0 systemd[1]: libpod-conmon-207ee24a86bd3dee9c7dea430682a34be20d07b8966d64bffa9898ae18bd5fa1.scope: Deactivated successfully.
Dec 01 09:33:49 compute-0 podman[252595]: 2025-12-01 09:33:49.971331873 +0000 UTC m=+0.039432646 container create f218af5f1b001dd9eeb2e91636aee5a66901fb3402b9c6f62b181443819b5217 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_yalow, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:33:50 compute-0 systemd[1]: Started libpod-conmon-f218af5f1b001dd9eeb2e91636aee5a66901fb3402b9c6f62b181443819b5217.scope.
Dec 01 09:33:50 compute-0 ceph-mon[75031]: from='client.14328 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Dec 01 09:33:50 compute-0 podman[252595]: 2025-12-01 09:33:49.952401668 +0000 UTC m=+0.020502491 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:33:50 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:33:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0081046c4980198874de97eb691a33c678aa58c11b36ebf7ef3ba3c82e0aada0/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:33:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0081046c4980198874de97eb691a33c678aa58c11b36ebf7ef3ba3c82e0aada0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:33:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0081046c4980198874de97eb691a33c678aa58c11b36ebf7ef3ba3c82e0aada0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:33:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0081046c4980198874de97eb691a33c678aa58c11b36ebf7ef3ba3c82e0aada0/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:33:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0081046c4980198874de97eb691a33c678aa58c11b36ebf7ef3ba3c82e0aada0/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:33:50 compute-0 podman[252595]: 2025-12-01 09:33:50.078074594 +0000 UTC m=+0.146175467 container init f218af5f1b001dd9eeb2e91636aee5a66901fb3402b9c6f62b181443819b5217 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_yalow, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:33:50 compute-0 podman[252595]: 2025-12-01 09:33:50.091178351 +0000 UTC m=+0.159279134 container start f218af5f1b001dd9eeb2e91636aee5a66901fb3402b9c6f62b181443819b5217 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_yalow, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:33:50 compute-0 podman[252595]: 2025-12-01 09:33:50.095236338 +0000 UTC m=+0.163337161 container attach f218af5f1b001dd9eeb2e91636aee5a66901fb3402b9c6f62b181443819b5217 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_yalow, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:33:50 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:33:51 compute-0 ceph-mon[75031]: pgmap v656: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:51 compute-0 gifted_yalow[252612]: --> passed data devices: 0 physical, 3 LVM
Dec 01 09:33:51 compute-0 gifted_yalow[252612]: --> relative data size: 1.0
Dec 01 09:33:51 compute-0 gifted_yalow[252612]: --> All data devices are unavailable
Dec 01 09:33:51 compute-0 systemd[1]: libpod-f218af5f1b001dd9eeb2e91636aee5a66901fb3402b9c6f62b181443819b5217.scope: Deactivated successfully.
Dec 01 09:33:51 compute-0 systemd[1]: libpod-f218af5f1b001dd9eeb2e91636aee5a66901fb3402b9c6f62b181443819b5217.scope: Consumed 1.011s CPU time.
Dec 01 09:33:51 compute-0 podman[252595]: 2025-12-01 09:33:51.158245907 +0000 UTC m=+1.226346720 container died f218af5f1b001dd9eeb2e91636aee5a66901fb3402b9c6f62b181443819b5217 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_yalow, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec 01 09:33:51 compute-0 systemd[1]: var-lib-containers-storage-overlay-0081046c4980198874de97eb691a33c678aa58c11b36ebf7ef3ba3c82e0aada0-merged.mount: Deactivated successfully.
Dec 01 09:33:51 compute-0 podman[252595]: 2025-12-01 09:33:51.432911469 +0000 UTC m=+1.501012252 container remove f218af5f1b001dd9eeb2e91636aee5a66901fb3402b9c6f62b181443819b5217 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_yalow, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Dec 01 09:33:51 compute-0 systemd[1]: libpod-conmon-f218af5f1b001dd9eeb2e91636aee5a66901fb3402b9c6f62b181443819b5217.scope: Deactivated successfully.
Dec 01 09:33:51 compute-0 sudo[252489]: pam_unix(sudo:session): session closed for user root
Dec 01 09:33:51 compute-0 sudo[252654]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:33:51 compute-0 sudo[252654]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:33:51 compute-0 sudo[252654]: pam_unix(sudo:session): session closed for user root
Dec 01 09:33:51 compute-0 sudo[252679]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:33:51 compute-0 sudo[252679]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:33:51 compute-0 sudo[252679]: pam_unix(sudo:session): session closed for user root
Dec 01 09:33:51 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v657: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:51 compute-0 sudo[252704]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:33:51 compute-0 sudo[252704]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:33:51 compute-0 sudo[252704]: pam_unix(sudo:session): session closed for user root
Dec 01 09:33:51 compute-0 sudo[252729]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- lvm list --format json
Dec 01 09:33:51 compute-0 sudo[252729]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:33:52 compute-0 podman[252794]: 2025-12-01 09:33:52.069865868 +0000 UTC m=+0.050013950 container create 2ee33a9facba95b0e2a3e43a7bc4ac1704800bee02ab1821b7f80df50ab7d297 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=confident_beaver, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3)
Dec 01 09:33:52 compute-0 systemd[1]: Started libpod-conmon-2ee33a9facba95b0e2a3e43a7bc4ac1704800bee02ab1821b7f80df50ab7d297.scope.
Dec 01 09:33:52 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:33:52 compute-0 podman[252794]: 2025-12-01 09:33:52.048057791 +0000 UTC m=+0.028205883 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:33:52 compute-0 podman[252794]: 2025-12-01 09:33:52.15159052 +0000 UTC m=+0.131738722 container init 2ee33a9facba95b0e2a3e43a7bc4ac1704800bee02ab1821b7f80df50ab7d297 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=confident_beaver, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec 01 09:33:52 compute-0 podman[252794]: 2025-12-01 09:33:52.160244939 +0000 UTC m=+0.140393021 container start 2ee33a9facba95b0e2a3e43a7bc4ac1704800bee02ab1821b7f80df50ab7d297 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=confident_beaver, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:33:52 compute-0 podman[252794]: 2025-12-01 09:33:52.163924075 +0000 UTC m=+0.144072257 container attach 2ee33a9facba95b0e2a3e43a7bc4ac1704800bee02ab1821b7f80df50ab7d297 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=confident_beaver, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 01 09:33:52 compute-0 confident_beaver[252810]: 167 167
Dec 01 09:33:52 compute-0 systemd[1]: libpod-2ee33a9facba95b0e2a3e43a7bc4ac1704800bee02ab1821b7f80df50ab7d297.scope: Deactivated successfully.
Dec 01 09:33:52 compute-0 podman[252794]: 2025-12-01 09:33:52.167122327 +0000 UTC m=+0.147270429 container died 2ee33a9facba95b0e2a3e43a7bc4ac1704800bee02ab1821b7f80df50ab7d297 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=confident_beaver, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:33:52 compute-0 systemd[1]: var-lib-containers-storage-overlay-5132fb86fd37443add15ceb550d92f191cd0ab8b857ef0c4bdb2b92a2a6d15f5-merged.mount: Deactivated successfully.
Dec 01 09:33:52 compute-0 podman[252794]: 2025-12-01 09:33:52.214470069 +0000 UTC m=+0.194618191 container remove 2ee33a9facba95b0e2a3e43a7bc4ac1704800bee02ab1821b7f80df50ab7d297 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=confident_beaver, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250507, OSD_FLAVOR=default)
Dec 01 09:33:52 compute-0 systemd[1]: libpod-conmon-2ee33a9facba95b0e2a3e43a7bc4ac1704800bee02ab1821b7f80df50ab7d297.scope: Deactivated successfully.
Dec 01 09:33:52 compute-0 podman[252833]: 2025-12-01 09:33:52.458502131 +0000 UTC m=+0.062893580 container create 8f2b80f71b3e039e47c7b3b35451f849b24a5e4e7c39a3cf856e6acdb45f8dae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=heuristic_hugle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:33:52 compute-0 systemd[1]: Started libpod-conmon-8f2b80f71b3e039e47c7b3b35451f849b24a5e4e7c39a3cf856e6acdb45f8dae.scope.
Dec 01 09:33:52 compute-0 podman[252833]: 2025-12-01 09:33:52.435800038 +0000 UTC m=+0.040191487 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:33:52 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:33:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8275c0984f5c1028e49633f9cd468d5738fe6ba121c8f76b5d9885c17522e761/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:33:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8275c0984f5c1028e49633f9cd468d5738fe6ba121c8f76b5d9885c17522e761/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:33:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8275c0984f5c1028e49633f9cd468d5738fe6ba121c8f76b5d9885c17522e761/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:33:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8275c0984f5c1028e49633f9cd468d5738fe6ba121c8f76b5d9885c17522e761/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:33:52 compute-0 podman[252833]: 2025-12-01 09:33:52.554109732 +0000 UTC m=+0.158501161 container init 8f2b80f71b3e039e47c7b3b35451f849b24a5e4e7c39a3cf856e6acdb45f8dae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=heuristic_hugle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:33:52 compute-0 podman[252833]: 2025-12-01 09:33:52.561224157 +0000 UTC m=+0.165615586 container start 8f2b80f71b3e039e47c7b3b35451f849b24a5e4e7c39a3cf856e6acdb45f8dae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=heuristic_hugle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Dec 01 09:33:52 compute-0 podman[252833]: 2025-12-01 09:33:52.564483861 +0000 UTC m=+0.168875320 container attach 8f2b80f71b3e039e47c7b3b35451f849b24a5e4e7c39a3cf856e6acdb45f8dae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=heuristic_hugle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:33:53 compute-0 ceph-mon[75031]: pgmap v657: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]: {
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:     "0": [
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:         {
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             "devices": [
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "/dev/loop3"
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             ],
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             "lv_name": "ceph_lv0",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             "lv_size": "21470642176",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             "name": "ceph_lv0",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             "tags": {
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.cluster_name": "ceph",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.crush_device_class": "",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.encrypted": "0",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.osd_id": "0",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.type": "block",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.vdo": "0"
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             },
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             "type": "block",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             "vg_name": "ceph_vg0"
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:         }
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:     ],
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:     "1": [
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:         {
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             "devices": [
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "/dev/loop4"
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             ],
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             "lv_name": "ceph_lv1",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             "lv_size": "21470642176",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             "name": "ceph_lv1",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             "tags": {
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.cluster_name": "ceph",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.crush_device_class": "",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.encrypted": "0",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.osd_id": "1",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.type": "block",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.vdo": "0"
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             },
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             "type": "block",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             "vg_name": "ceph_vg1"
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:         }
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:     ],
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:     "2": [
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:         {
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             "devices": [
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "/dev/loop5"
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             ],
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             "lv_name": "ceph_lv2",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             "lv_size": "21470642176",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             "name": "ceph_lv2",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             "tags": {
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.cluster_name": "ceph",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.crush_device_class": "",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.encrypted": "0",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.osd_id": "2",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.type": "block",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:                 "ceph.vdo": "0"
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             },
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             "type": "block",
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:             "vg_name": "ceph_vg2"
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:         }
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]:     ]
Dec 01 09:33:53 compute-0 heuristic_hugle[252849]: }
Dec 01 09:33:53 compute-0 systemd[1]: libpod-8f2b80f71b3e039e47c7b3b35451f849b24a5e4e7c39a3cf856e6acdb45f8dae.scope: Deactivated successfully.
Dec 01 09:33:53 compute-0 podman[252833]: 2025-12-01 09:33:53.307922834 +0000 UTC m=+0.912314273 container died 8f2b80f71b3e039e47c7b3b35451f849b24a5e4e7c39a3cf856e6acdb45f8dae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=heuristic_hugle, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507)
Dec 01 09:33:53 compute-0 rsyslogd[1007]: imjournal from <np0005540741:heuristic_hugle>: begin to drop messages due to rate-limiting
Dec 01 09:33:53 compute-0 systemd[1]: var-lib-containers-storage-overlay-8275c0984f5c1028e49633f9cd468d5738fe6ba121c8f76b5d9885c17522e761-merged.mount: Deactivated successfully.
Dec 01 09:33:53 compute-0 podman[252833]: 2025-12-01 09:33:53.374553011 +0000 UTC m=+0.978944430 container remove 8f2b80f71b3e039e47c7b3b35451f849b24a5e4e7c39a3cf856e6acdb45f8dae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=heuristic_hugle, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec 01 09:33:53 compute-0 systemd[1]: libpod-conmon-8f2b80f71b3e039e47c7b3b35451f849b24a5e4e7c39a3cf856e6acdb45f8dae.scope: Deactivated successfully.
Dec 01 09:33:53 compute-0 sudo[252729]: pam_unix(sudo:session): session closed for user root
Dec 01 09:33:53 compute-0 sudo[252870]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:33:53 compute-0 sudo[252870]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:33:53 compute-0 sudo[252870]: pam_unix(sudo:session): session closed for user root
Dec 01 09:33:53 compute-0 sudo[252895]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:33:53 compute-0 sudo[252895]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:33:53 compute-0 sudo[252895]: pam_unix(sudo:session): session closed for user root
Dec 01 09:33:53 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v658: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:53 compute-0 sudo[252920]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:33:53 compute-0 sudo[252920]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:33:53 compute-0 sudo[252920]: pam_unix(sudo:session): session closed for user root
Dec 01 09:33:53 compute-0 sudo[252945]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- raw list --format json
Dec 01 09:33:53 compute-0 sudo[252945]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:33:54 compute-0 podman[253012]: 2025-12-01 09:33:54.007666309 +0000 UTC m=+0.053097029 container create f78f40664e7a6eb09de0354d44da7a09f74c4db793beae7bbb11c3c7f98fcb4e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_banach, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2)
Dec 01 09:33:54 compute-0 podman[253012]: 2025-12-01 09:33:53.981264409 +0000 UTC m=+0.026695209 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:33:54 compute-0 systemd[1]: Started libpod-conmon-f78f40664e7a6eb09de0354d44da7a09f74c4db793beae7bbb11c3c7f98fcb4e.scope.
Dec 01 09:33:54 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:33:54 compute-0 podman[253012]: 2025-12-01 09:33:54.132633455 +0000 UTC m=+0.178064195 container init f78f40664e7a6eb09de0354d44da7a09f74c4db793beae7bbb11c3c7f98fcb4e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_banach, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:33:54 compute-0 podman[253012]: 2025-12-01 09:33:54.138414632 +0000 UTC m=+0.183845362 container start f78f40664e7a6eb09de0354d44da7a09f74c4db793beae7bbb11c3c7f98fcb4e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_banach, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:33:54 compute-0 busy_banach[253029]: 167 167
Dec 01 09:33:54 compute-0 podman[253012]: 2025-12-01 09:33:54.141570182 +0000 UTC m=+0.187000902 container attach f78f40664e7a6eb09de0354d44da7a09f74c4db793beae7bbb11c3c7f98fcb4e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_banach, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef)
Dec 01 09:33:54 compute-0 systemd[1]: libpod-f78f40664e7a6eb09de0354d44da7a09f74c4db793beae7bbb11c3c7f98fcb4e.scope: Deactivated successfully.
Dec 01 09:33:54 compute-0 podman[253012]: 2025-12-01 09:33:54.143262531 +0000 UTC m=+0.188693271 container died f78f40664e7a6eb09de0354d44da7a09f74c4db793beae7bbb11c3c7f98fcb4e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_banach, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:33:54 compute-0 systemd[1]: var-lib-containers-storage-overlay-276b7f3c3606946c382998aae18cb6a2c94af5ce3bd5809ad49d664d88807a7a-merged.mount: Deactivated successfully.
Dec 01 09:33:54 compute-0 podman[253012]: 2025-12-01 09:33:54.181906933 +0000 UTC m=+0.227337653 container remove f78f40664e7a6eb09de0354d44da7a09f74c4db793beae7bbb11c3c7f98fcb4e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_banach, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:33:54 compute-0 systemd[1]: libpod-conmon-f78f40664e7a6eb09de0354d44da7a09f74c4db793beae7bbb11c3c7f98fcb4e.scope: Deactivated successfully.
Dec 01 09:33:54 compute-0 podman[253053]: 2025-12-01 09:33:54.346565091 +0000 UTC m=+0.041350591 container create 1a7765c4218e36c80bf74b5762b98a2e525c6bcbed95ed30123f8d83450c52d9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_vaughan, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Dec 01 09:33:54 compute-0 systemd[1]: Started libpod-conmon-1a7765c4218e36c80bf74b5762b98a2e525c6bcbed95ed30123f8d83450c52d9.scope.
Dec 01 09:33:54 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:33:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ad9a321e806b5a509635447afc29bda5b9da6d811edc332245cdc35052c1982/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:33:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ad9a321e806b5a509635447afc29bda5b9da6d811edc332245cdc35052c1982/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:33:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ad9a321e806b5a509635447afc29bda5b9da6d811edc332245cdc35052c1982/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:33:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ad9a321e806b5a509635447afc29bda5b9da6d811edc332245cdc35052c1982/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:33:54 compute-0 podman[253053]: 2025-12-01 09:33:54.417828552 +0000 UTC m=+0.112614082 container init 1a7765c4218e36c80bf74b5762b98a2e525c6bcbed95ed30123f8d83450c52d9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_vaughan, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3)
Dec 01 09:33:54 compute-0 podman[253053]: 2025-12-01 09:33:54.327122132 +0000 UTC m=+0.021907672 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:33:54 compute-0 podman[253053]: 2025-12-01 09:33:54.424196665 +0000 UTC m=+0.118982155 container start 1a7765c4218e36c80bf74b5762b98a2e525c6bcbed95ed30123f8d83450c52d9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_vaughan, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:33:54 compute-0 podman[253053]: 2025-12-01 09:33:54.427434028 +0000 UTC m=+0.122219568 container attach 1a7765c4218e36c80bf74b5762b98a2e525c6bcbed95ed30123f8d83450c52d9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_vaughan, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 01 09:33:55 compute-0 ceph-mon[75031]: pgmap v658: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:55 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:33:55 compute-0 kind_vaughan[253069]: {
Dec 01 09:33:55 compute-0 kind_vaughan[253069]:     "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec 01 09:33:55 compute-0 kind_vaughan[253069]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:33:55 compute-0 kind_vaughan[253069]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 01 09:33:55 compute-0 kind_vaughan[253069]:         "osd_id": 0,
Dec 01 09:33:55 compute-0 kind_vaughan[253069]:         "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:33:55 compute-0 kind_vaughan[253069]:         "type": "bluestore"
Dec 01 09:33:55 compute-0 kind_vaughan[253069]:     },
Dec 01 09:33:55 compute-0 kind_vaughan[253069]:     "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec 01 09:33:55 compute-0 kind_vaughan[253069]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:33:55 compute-0 kind_vaughan[253069]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 01 09:33:55 compute-0 kind_vaughan[253069]:         "osd_id": 1,
Dec 01 09:33:55 compute-0 kind_vaughan[253069]:         "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:33:55 compute-0 kind_vaughan[253069]:         "type": "bluestore"
Dec 01 09:33:55 compute-0 kind_vaughan[253069]:     },
Dec 01 09:33:55 compute-0 kind_vaughan[253069]:     "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec 01 09:33:55 compute-0 kind_vaughan[253069]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:33:55 compute-0 kind_vaughan[253069]:         "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec 01 09:33:55 compute-0 kind_vaughan[253069]:         "osd_id": 2,
Dec 01 09:33:55 compute-0 kind_vaughan[253069]:         "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:33:55 compute-0 kind_vaughan[253069]:         "type": "bluestore"
Dec 01 09:33:55 compute-0 kind_vaughan[253069]:     }
Dec 01 09:33:55 compute-0 kind_vaughan[253069]: }
Dec 01 09:33:55 compute-0 systemd[1]: libpod-1a7765c4218e36c80bf74b5762b98a2e525c6bcbed95ed30123f8d83450c52d9.scope: Deactivated successfully.
Dec 01 09:33:55 compute-0 systemd[1]: libpod-1a7765c4218e36c80bf74b5762b98a2e525c6bcbed95ed30123f8d83450c52d9.scope: Consumed 1.032s CPU time.
Dec 01 09:33:55 compute-0 podman[253102]: 2025-12-01 09:33:55.500218497 +0000 UTC m=+0.032966059 container died 1a7765c4218e36c80bf74b5762b98a2e525c6bcbed95ed30123f8d83450c52d9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_vaughan, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:33:55 compute-0 systemd[1]: var-lib-containers-storage-overlay-9ad9a321e806b5a509635447afc29bda5b9da6d811edc332245cdc35052c1982-merged.mount: Deactivated successfully.
Dec 01 09:33:55 compute-0 podman[253102]: 2025-12-01 09:33:55.572007803 +0000 UTC m=+0.104755325 container remove 1a7765c4218e36c80bf74b5762b98a2e525c6bcbed95ed30123f8d83450c52d9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_vaughan, io.buildah.version=1.39.3, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef)
Dec 01 09:33:55 compute-0 systemd[1]: libpod-conmon-1a7765c4218e36c80bf74b5762b98a2e525c6bcbed95ed30123f8d83450c52d9.scope: Deactivated successfully.
Dec 01 09:33:55 compute-0 podman[253103]: 2025-12-01 09:33:55.58440309 +0000 UTC m=+0.103890181 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2)
Dec 01 09:33:55 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v659: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:55 compute-0 sudo[252945]: pam_unix(sudo:session): session closed for user root
Dec 01 09:33:55 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:33:55 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:33:55 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:33:55 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:33:55 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 1c19971d-34be-45ab-bfc2-4e9fb21eaaac does not exist
Dec 01 09:33:55 compute-0 sudo[253144]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:33:55 compute-0 sudo[253144]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:33:55 compute-0 sudo[253144]: pam_unix(sudo:session): session closed for user root
Dec 01 09:33:55 compute-0 sudo[253169]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:33:55 compute-0 sudo[253169]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:33:55 compute-0 sudo[253169]: pam_unix(sudo:session): session closed for user root
Dec 01 09:33:56 compute-0 ceph-mon[75031]: pgmap v659: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:56 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:33:56 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:33:57 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v660: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:58 compute-0 ceph-mon[75031]: pgmap v660: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:59 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v661: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:33:59 compute-0 podman[253194]: 2025-12-01 09:33:59.964674412 +0000 UTC m=+0.064928449 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 01 09:34:00 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:34:00 compute-0 ceph-mon[75031]: pgmap v661: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:01 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v662: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:02 compute-0 ceph-mon[75031]: pgmap v662: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:03 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v663: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:04 compute-0 ceph-mon[75031]: pgmap v663: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:05 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:34:05 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v664: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:06 compute-0 ceph-mon[75031]: pgmap v664: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:07 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v665: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:08 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "version", "format": "json"} v 0) v1
Dec 01 09:34:08 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/912829570' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Dec 01 09:34:08 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14330 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Dec 01 09:34:08 compute-0 ceph-mgr[75324]: [volumes INFO volumes.module] Starting _cmd_fs_volume_ls(format:json, prefix:fs volume ls) < ""
Dec 01 09:34:08 compute-0 ceph-mgr[75324]: [volumes INFO volumes.module] Finishing _cmd_fs_volume_ls(format:json, prefix:fs volume ls) < ""
Dec 01 09:34:08 compute-0 ceph-mon[75031]: pgmap v665: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:08 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/912829570' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Dec 01 09:34:08 compute-0 ceph-mon[75031]: from='client.14330 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Dec 01 09:34:09 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v666: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:10 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:34:10 compute-0 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:34:10 compute-0 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Cumulative writes: 4208 writes, 19K keys, 4208 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 4208 writes, 369 syncs, 11.40 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b3090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b3090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b3090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 01 09:34:10 compute-0 ceph-mon[75031]: pgmap v666: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:11 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v667: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:12 compute-0 ceph-mon[75031]: pgmap v667: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:34:13
Dec 01 09:34:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:34:13 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:34:13 compute-0 ceph-mgr[75324]: [balancer INFO root] pools ['backups', '.mgr', 'cephfs.cephfs.meta', 'images', 'volumes', 'cephfs.cephfs.data', 'vms']
Dec 01 09:34:13 compute-0 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec 01 09:34:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:34:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:34:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:34:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:34:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:34:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:34:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:34:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:34:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:34:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:34:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:34:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:34:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:34:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:34:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:34:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:34:13 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v668: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:14 compute-0 ceph-mon[75031]: pgmap v668: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:15 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:34:15 compute-0 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:34:15 compute-0 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Cumulative writes: 4343 writes, 19K keys, 4343 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 4343 writes, 398 syncs, 10.91 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 0.0001 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 0.0001 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 0.0001 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 0.0001 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 0.0001 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 0.0001 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 0.0001 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f19149090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f19149090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f19149090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 0.0001 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 0.0001 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 01 09:34:15 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v669: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:16 compute-0 ceph-mon[75031]: pgmap v669: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:16 compute-0 podman[253213]: 2025-12-01 09:34:16.979823856 +0000 UTC m=+0.075278148 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, managed_by=edpm_ansible)
Dec 01 09:34:17 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v670: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:34:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:34:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:34:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:34:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:34:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:34:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:34:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:34:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:34:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:34:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:34:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:34:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec 01 09:34:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:34:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:34:18 compute-0 ceph-mon[75031]: pgmap v670: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:19 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v671: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:20 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:34:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:34:20.469 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:34:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:34:20.470 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:34:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:34:20.470 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:34:20 compute-0 ceph-mon[75031]: pgmap v671: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:21 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v672: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:22 compute-0 ceph-mon[75031]: pgmap v672: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:23 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v673: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:24 compute-0 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:34:24 compute-0 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Cumulative writes: 4162 writes, 19K keys, 4162 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 4162 writes, 352 syncs, 11.82 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.055       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.055       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.05              0.00         1    0.055       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 01 09:34:24 compute-0 ceph-mon[75031]: pgmap v673: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:34:25 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v674: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:26 compute-0 podman[253233]: 2025-12-01 09:34:26.019691811 +0000 UTC m=+0.105548908 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 01 09:34:26 compute-0 ceph-mon[75031]: pgmap v674: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:27 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v675: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:28 compute-0 ceph-mon[75031]: pgmap v675: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:28 compute-0 nova_compute[250706]: 2025-12-01 09:34:28.830 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:34:28 compute-0 nova_compute[250706]: 2025-12-01 09:34:28.831 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:34:28 compute-0 nova_compute[250706]: 2025-12-01 09:34:28.854 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:34:28 compute-0 nova_compute[250706]: 2025-12-01 09:34:28.854 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 09:34:28 compute-0 nova_compute[250706]: 2025-12-01 09:34:28.854 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 09:34:28 compute-0 nova_compute[250706]: 2025-12-01 09:34:28.868 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 09:34:28 compute-0 nova_compute[250706]: 2025-12-01 09:34:28.868 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:34:28 compute-0 nova_compute[250706]: 2025-12-01 09:34:28.869 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:34:28 compute-0 nova_compute[250706]: 2025-12-01 09:34:28.869 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:34:28 compute-0 nova_compute[250706]: 2025-12-01 09:34:28.869 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:34:28 compute-0 nova_compute[250706]: 2025-12-01 09:34:28.869 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 09:34:28 compute-0 nova_compute[250706]: 2025-12-01 09:34:28.869 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:34:28 compute-0 nova_compute[250706]: 2025-12-01 09:34:28.898 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:34:28 compute-0 nova_compute[250706]: 2025-12-01 09:34:28.899 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:34:28 compute-0 nova_compute[250706]: 2025-12-01 09:34:28.899 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:34:28 compute-0 nova_compute[250706]: 2025-12-01 09:34:28.899 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 09:34:28 compute-0 nova_compute[250706]: 2025-12-01 09:34:28.899 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:34:28 compute-0 ceph-mgr[75324]: [devicehealth INFO root] Check health
Dec 01 09:34:29 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:34:29 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4120210584' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:34:29 compute-0 nova_compute[250706]: 2025-12-01 09:34:29.389 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:34:29 compute-0 nova_compute[250706]: 2025-12-01 09:34:29.610 250710 WARNING nova.virt.libvirt.driver [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 09:34:29 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v676: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:29 compute-0 nova_compute[250706]: 2025-12-01 09:34:29.612 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5306MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 09:34:29 compute-0 nova_compute[250706]: 2025-12-01 09:34:29.612 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:34:29 compute-0 nova_compute[250706]: 2025-12-01 09:34:29.613 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:34:29 compute-0 nova_compute[250706]: 2025-12-01 09:34:29.696 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 09:34:29 compute-0 nova_compute[250706]: 2025-12-01 09:34:29.697 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 09:34:29 compute-0 nova_compute[250706]: 2025-12-01 09:34:29.719 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:34:29 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/4120210584' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:34:30 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:34:30 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:34:30 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3641530612' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:34:30 compute-0 nova_compute[250706]: 2025-12-01 09:34:30.200 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:34:30 compute-0 nova_compute[250706]: 2025-12-01 09:34:30.207 250710 DEBUG nova.compute.provider_tree [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed in ProviderTree for provider: 847e3dbe-0f76-4032-a374-8c965945c22f update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 09:34:30 compute-0 nova_compute[250706]: 2025-12-01 09:34:30.228 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed for provider 847e3dbe-0f76-4032-a374-8c965945c22f based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 09:34:30 compute-0 nova_compute[250706]: 2025-12-01 09:34:30.230 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 09:34:30 compute-0 nova_compute[250706]: 2025-12-01 09:34:30.230 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.617s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:34:30 compute-0 podman[253303]: 2025-12-01 09:34:30.294517031 +0000 UTC m=+0.080958241 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 01 09:34:30 compute-0 nova_compute[250706]: 2025-12-01 09:34:30.413 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:34:30 compute-0 nova_compute[250706]: 2025-12-01 09:34:30.414 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:34:30 compute-0 ceph-mon[75031]: pgmap v676: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:30 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3641530612' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:34:31 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v677: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:32 compute-0 ceph-mon[75031]: pgmap v677: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:33 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v678: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:34 compute-0 ceph-mon[75031]: pgmap v678: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:35 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:34:35 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v679: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:36 compute-0 ceph-mon[75031]: pgmap v679: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:37 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v680: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:38 compute-0 ceph-mon[75031]: pgmap v680: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:39 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v681: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:34:40 compute-0 ceph-mon[75031]: pgmap v681: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:41 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v682: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:42 compute-0 ceph-mon[75031]: pgmap v682: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:34:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:34:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:34:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:34:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:34:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:34:43 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v683: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Dec 01 09:34:44 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1219757723' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:34:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Dec 01 09:34:44 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1219757723' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:34:44 compute-0 ceph-mon[75031]: pgmap v683: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:44 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/1219757723' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:34:44 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/1219757723' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:34:45 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:34:45 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v684: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:45 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:34:45.796 159899 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:9e:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '66:a0:73:58:3b:fd'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 01 09:34:45 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:34:45.798 159899 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 01 09:34:45 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:34:45.800 159899 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a8013a17-6378-4c2f-a5de-9d3b29c7a42e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 01 09:34:46 compute-0 ceph-mon[75031]: pgmap v684: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:47 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v685: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:48 compute-0 podman[253324]: 2025-12-01 09:34:48.004819817 +0000 UTC m=+0.095534650 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 01 09:34:48 compute-0 ceph-mon[75031]: pgmap v685: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:49 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v686: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:50 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:34:50 compute-0 ceph-mon[75031]: pgmap v686: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:51 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v687: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:52 compute-0 ceph-mon[75031]: pgmap v687: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:53 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v688: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:55 compute-0 ceph-mon[75031]: pgmap v688: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:55 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:34:55 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v689: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:55 compute-0 sudo[253344]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:34:55 compute-0 sudo[253344]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:34:55 compute-0 sudo[253344]: pam_unix(sudo:session): session closed for user root
Dec 01 09:34:55 compute-0 sudo[253369]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:34:55 compute-0 sudo[253369]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:34:55 compute-0 sudo[253369]: pam_unix(sudo:session): session closed for user root
Dec 01 09:34:56 compute-0 sudo[253394]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:34:56 compute-0 sudo[253394]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:34:56 compute-0 sudo[253394]: pam_unix(sudo:session): session closed for user root
Dec 01 09:34:56 compute-0 sudo[253419]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Dec 01 09:34:56 compute-0 sudo[253419]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:34:56 compute-0 podman[253443]: 2025-12-01 09:34:56.233572762 +0000 UTC m=+0.119244733 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 01 09:34:56 compute-0 sudo[253419]: pam_unix(sudo:session): session closed for user root
Dec 01 09:34:56 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"} v 0) v1
Dec 01 09:34:56 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec 01 09:34:56 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:34:56 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:34:56 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec 01 09:34:56 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:34:56 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec 01 09:34:56 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:34:56 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev b2e63266-5cf6-4c1e-bcdd-adbc7461b6a1 does not exist
Dec 01 09:34:56 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 85554432-6526-4188-964e-2ca722331d60 does not exist
Dec 01 09:34:56 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev fde50bb6-852d-4408-8a8a-a31be5c19a24 does not exist
Dec 01 09:34:56 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec 01 09:34:56 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:34:56 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec 01 09:34:56 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:34:56 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:34:56 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:34:56 compute-0 sudo[253500]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:34:56 compute-0 sudo[253500]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:34:56 compute-0 sudo[253500]: pam_unix(sudo:session): session closed for user root
Dec 01 09:34:56 compute-0 sudo[253525]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:34:56 compute-0 sudo[253525]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:34:56 compute-0 sudo[253525]: pam_unix(sudo:session): session closed for user root
Dec 01 09:34:56 compute-0 sudo[253550]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:34:56 compute-0 sudo[253550]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:34:56 compute-0 sudo[253550]: pam_unix(sudo:session): session closed for user root
Dec 01 09:34:56 compute-0 sudo[253575]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --yes --no-systemd
Dec 01 09:34:56 compute-0 sudo[253575]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:34:57 compute-0 ceph-mon[75031]: pgmap v689: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:57 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec 01 09:34:57 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:34:57 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:34:57 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:34:57 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:34:57 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:34:57 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:34:57 compute-0 podman[253640]: 2025-12-01 09:34:57.353311703 +0000 UTC m=+0.051313498 container create 67fdd28534696904120644a46fe4b8d7f9058455d43c4984b5c60ed211f8d06e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_bhabha, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec 01 09:34:57 compute-0 systemd[1]: Started libpod-conmon-67fdd28534696904120644a46fe4b8d7f9058455d43c4984b5c60ed211f8d06e.scope.
Dec 01 09:34:57 compute-0 podman[253640]: 2025-12-01 09:34:57.327959753 +0000 UTC m=+0.025961578 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:34:57 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:34:57 compute-0 podman[253640]: 2025-12-01 09:34:57.461121855 +0000 UTC m=+0.159123660 container init 67fdd28534696904120644a46fe4b8d7f9058455d43c4984b5c60ed211f8d06e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_bhabha, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:34:57 compute-0 podman[253640]: 2025-12-01 09:34:57.467063856 +0000 UTC m=+0.165065651 container start 67fdd28534696904120644a46fe4b8d7f9058455d43c4984b5c60ed211f8d06e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_bhabha, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:34:57 compute-0 podman[253640]: 2025-12-01 09:34:57.470416903 +0000 UTC m=+0.168418708 container attach 67fdd28534696904120644a46fe4b8d7f9058455d43c4984b5c60ed211f8d06e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_bhabha, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0)
Dec 01 09:34:57 compute-0 systemd[1]: libpod-67fdd28534696904120644a46fe4b8d7f9058455d43c4984b5c60ed211f8d06e.scope: Deactivated successfully.
Dec 01 09:34:57 compute-0 admiring_bhabha[253656]: 167 167
Dec 01 09:34:57 compute-0 conmon[253656]: conmon 67fdd285346969041206 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-67fdd28534696904120644a46fe4b8d7f9058455d43c4984b5c60ed211f8d06e.scope/container/memory.events
Dec 01 09:34:57 compute-0 podman[253640]: 2025-12-01 09:34:57.474365846 +0000 UTC m=+0.172367631 container died 67fdd28534696904120644a46fe4b8d7f9058455d43c4984b5c60ed211f8d06e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_bhabha, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0)
Dec 01 09:34:57 compute-0 systemd[1]: var-lib-containers-storage-overlay-60f484de045632ed0900ae956a7f5f608e6bfbf5f97022e236092c24baaceba2-merged.mount: Deactivated successfully.
Dec 01 09:34:57 compute-0 podman[253640]: 2025-12-01 09:34:57.521140652 +0000 UTC m=+0.219142467 container remove 67fdd28534696904120644a46fe4b8d7f9058455d43c4984b5c60ed211f8d06e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_bhabha, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Dec 01 09:34:57 compute-0 systemd[1]: libpod-conmon-67fdd28534696904120644a46fe4b8d7f9058455d43c4984b5c60ed211f8d06e.scope: Deactivated successfully.
Dec 01 09:34:57 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v690: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:57 compute-0 podman[253680]: 2025-12-01 09:34:57.727758818 +0000 UTC m=+0.062216322 container create 9eaa648db03afc7cc7983b84b7e5749bada17b196c5fe10ea14cd183c881daac (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=blissful_cray, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec 01 09:34:57 compute-0 systemd[1]: Started libpod-conmon-9eaa648db03afc7cc7983b84b7e5749bada17b196c5fe10ea14cd183c881daac.scope.
Dec 01 09:34:57 compute-0 podman[253680]: 2025-12-01 09:34:57.698050853 +0000 UTC m=+0.032508417 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:34:57 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:34:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/48f233c60e88161a1176d21745c39baf20708760fd1096a6c73b278a452767c1/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:34:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/48f233c60e88161a1176d21745c39baf20708760fd1096a6c73b278a452767c1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:34:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/48f233c60e88161a1176d21745c39baf20708760fd1096a6c73b278a452767c1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:34:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/48f233c60e88161a1176d21745c39baf20708760fd1096a6c73b278a452767c1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:34:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/48f233c60e88161a1176d21745c39baf20708760fd1096a6c73b278a452767c1/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:34:57 compute-0 podman[253680]: 2025-12-01 09:34:57.832567674 +0000 UTC m=+0.167025178 container init 9eaa648db03afc7cc7983b84b7e5749bada17b196c5fe10ea14cd183c881daac (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=blissful_cray, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:34:57 compute-0 podman[253680]: 2025-12-01 09:34:57.842843989 +0000 UTC m=+0.177301483 container start 9eaa648db03afc7cc7983b84b7e5749bada17b196c5fe10ea14cd183c881daac (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=blissful_cray, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:34:57 compute-0 podman[253680]: 2025-12-01 09:34:57.84704346 +0000 UTC m=+0.181500944 container attach 9eaa648db03afc7cc7983b84b7e5749bada17b196c5fe10ea14cd183c881daac (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=blissful_cray, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec 01 09:34:58 compute-0 blissful_cray[253696]: --> passed data devices: 0 physical, 3 LVM
Dec 01 09:34:58 compute-0 blissful_cray[253696]: --> relative data size: 1.0
Dec 01 09:34:58 compute-0 blissful_cray[253696]: --> All data devices are unavailable
Dec 01 09:34:58 compute-0 systemd[1]: libpod-9eaa648db03afc7cc7983b84b7e5749bada17b196c5fe10ea14cd183c881daac.scope: Deactivated successfully.
Dec 01 09:34:58 compute-0 systemd[1]: libpod-9eaa648db03afc7cc7983b84b7e5749bada17b196c5fe10ea14cd183c881daac.scope: Consumed 1.102s CPU time.
Dec 01 09:34:58 compute-0 podman[253680]: 2025-12-01 09:34:58.983318347 +0000 UTC m=+1.317775821 container died 9eaa648db03afc7cc7983b84b7e5749bada17b196c5fe10ea14cd183c881daac (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=blissful_cray, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2)
Dec 01 09:34:59 compute-0 systemd[1]: var-lib-containers-storage-overlay-48f233c60e88161a1176d21745c39baf20708760fd1096a6c73b278a452767c1-merged.mount: Deactivated successfully.
Dec 01 09:34:59 compute-0 podman[253680]: 2025-12-01 09:34:59.041416859 +0000 UTC m=+1.375874323 container remove 9eaa648db03afc7cc7983b84b7e5749bada17b196c5fe10ea14cd183c881daac (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=blissful_cray, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Dec 01 09:34:59 compute-0 ceph-mon[75031]: pgmap v690: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:59 compute-0 systemd[1]: libpod-conmon-9eaa648db03afc7cc7983b84b7e5749bada17b196c5fe10ea14cd183c881daac.scope: Deactivated successfully.
Dec 01 09:34:59 compute-0 sudo[253575]: pam_unix(sudo:session): session closed for user root
Dec 01 09:34:59 compute-0 sudo[253740]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:34:59 compute-0 sudo[253740]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:34:59 compute-0 sudo[253740]: pam_unix(sudo:session): session closed for user root
Dec 01 09:34:59 compute-0 sudo[253765]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:34:59 compute-0 sudo[253765]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:34:59 compute-0 sudo[253765]: pam_unix(sudo:session): session closed for user root
Dec 01 09:34:59 compute-0 sudo[253790]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:34:59 compute-0 sudo[253790]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:34:59 compute-0 sudo[253790]: pam_unix(sudo:session): session closed for user root
Dec 01 09:34:59 compute-0 sudo[253815]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- lvm list --format json
Dec 01 09:34:59 compute-0 sudo[253815]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:34:59 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v691: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:34:59 compute-0 podman[253880]: 2025-12-01 09:34:59.721422055 +0000 UTC m=+0.039192108 container create d0047426d0064cc23fea79ce6e4e80e4a4a1584068963a4bde3bc9615cf9f219 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_fermat, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:34:59 compute-0 systemd[1]: Started libpod-conmon-d0047426d0064cc23fea79ce6e4e80e4a4a1584068963a4bde3bc9615cf9f219.scope.
Dec 01 09:34:59 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:34:59 compute-0 podman[253880]: 2025-12-01 09:34:59.701881763 +0000 UTC m=+0.019651836 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:34:59 compute-0 podman[253880]: 2025-12-01 09:34:59.802140058 +0000 UTC m=+0.119910111 container init d0047426d0064cc23fea79ce6e4e80e4a4a1584068963a4bde3bc9615cf9f219 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_fermat, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:34:59 compute-0 podman[253880]: 2025-12-01 09:34:59.81020058 +0000 UTC m=+0.127970633 container start d0047426d0064cc23fea79ce6e4e80e4a4a1584068963a4bde3bc9615cf9f219 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_fermat, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, ceph=True, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Dec 01 09:34:59 compute-0 boring_fermat[253896]: 167 167
Dec 01 09:34:59 compute-0 systemd[1]: libpod-d0047426d0064cc23fea79ce6e4e80e4a4a1584068963a4bde3bc9615cf9f219.scope: Deactivated successfully.
Dec 01 09:34:59 compute-0 podman[253880]: 2025-12-01 09:34:59.815722239 +0000 UTC m=+0.133492302 container attach d0047426d0064cc23fea79ce6e4e80e4a4a1584068963a4bde3bc9615cf9f219 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_fermat, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507)
Dec 01 09:34:59 compute-0 podman[253880]: 2025-12-01 09:34:59.816584064 +0000 UTC m=+0.134354137 container died d0047426d0064cc23fea79ce6e4e80e4a4a1584068963a4bde3bc9615cf9f219 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_fermat, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507)
Dec 01 09:34:59 compute-0 systemd[1]: var-lib-containers-storage-overlay-1a00ff66b823dbb2fa0b5c3041021a85639739e919a8ac323ac05812e8c26f8b-merged.mount: Deactivated successfully.
Dec 01 09:34:59 compute-0 podman[253880]: 2025-12-01 09:34:59.851761546 +0000 UTC m=+0.169531599 container remove d0047426d0064cc23fea79ce6e4e80e4a4a1584068963a4bde3bc9615cf9f219 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_fermat, CEPH_REF=reef, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:34:59 compute-0 systemd[1]: libpod-conmon-d0047426d0064cc23fea79ce6e4e80e4a4a1584068963a4bde3bc9615cf9f219.scope: Deactivated successfully.
Dec 01 09:35:00 compute-0 podman[253920]: 2025-12-01 09:35:00.013322245 +0000 UTC m=+0.045231363 container create ae6077d7a1a152e046e5b9b09ec52dca27712867917601d69337f48f1aa228e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_galois, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Dec 01 09:35:00 compute-0 systemd[1]: Started libpod-conmon-ae6077d7a1a152e046e5b9b09ec52dca27712867917601d69337f48f1aa228e6.scope.
Dec 01 09:35:00 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:35:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f42d7ff89e3ed9e1a1a9b36a5a911ff7a7be88b3d7023711fbd7c89b3434d4f0/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:35:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f42d7ff89e3ed9e1a1a9b36a5a911ff7a7be88b3d7023711fbd7c89b3434d4f0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:35:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f42d7ff89e3ed9e1a1a9b36a5a911ff7a7be88b3d7023711fbd7c89b3434d4f0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:35:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f42d7ff89e3ed9e1a1a9b36a5a911ff7a7be88b3d7023711fbd7c89b3434d4f0/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:35:00 compute-0 podman[253920]: 2025-12-01 09:35:00.089912179 +0000 UTC m=+0.121821317 container init ae6077d7a1a152e046e5b9b09ec52dca27712867917601d69337f48f1aa228e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_galois, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2)
Dec 01 09:35:00 compute-0 podman[253920]: 2025-12-01 09:34:59.992588208 +0000 UTC m=+0.024497346 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:35:00 compute-0 podman[253920]: 2025-12-01 09:35:00.098199727 +0000 UTC m=+0.130108845 container start ae6077d7a1a152e046e5b9b09ec52dca27712867917601d69337f48f1aa228e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_galois, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0)
Dec 01 09:35:00 compute-0 podman[253920]: 2025-12-01 09:35:00.101876673 +0000 UTC m=+0.133785801 container attach ae6077d7a1a152e046e5b9b09ec52dca27712867917601d69337f48f1aa228e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_galois, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:35:00 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:35:00 compute-0 goofy_galois[253937]: {
Dec 01 09:35:00 compute-0 goofy_galois[253937]:     "0": [
Dec 01 09:35:00 compute-0 goofy_galois[253937]:         {
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             "devices": [
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "/dev/loop3"
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             ],
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             "lv_name": "ceph_lv0",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             "lv_size": "21470642176",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             "name": "ceph_lv0",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             "tags": {
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.cluster_name": "ceph",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.crush_device_class": "",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.encrypted": "0",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.osd_id": "0",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.type": "block",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.vdo": "0"
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             },
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             "type": "block",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             "vg_name": "ceph_vg0"
Dec 01 09:35:00 compute-0 goofy_galois[253937]:         }
Dec 01 09:35:00 compute-0 goofy_galois[253937]:     ],
Dec 01 09:35:00 compute-0 goofy_galois[253937]:     "1": [
Dec 01 09:35:00 compute-0 goofy_galois[253937]:         {
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             "devices": [
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "/dev/loop4"
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             ],
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             "lv_name": "ceph_lv1",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             "lv_size": "21470642176",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             "name": "ceph_lv1",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             "tags": {
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.cluster_name": "ceph",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.crush_device_class": "",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.encrypted": "0",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.osd_id": "1",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.type": "block",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.vdo": "0"
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             },
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             "type": "block",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             "vg_name": "ceph_vg1"
Dec 01 09:35:00 compute-0 goofy_galois[253937]:         }
Dec 01 09:35:00 compute-0 goofy_galois[253937]:     ],
Dec 01 09:35:00 compute-0 goofy_galois[253937]:     "2": [
Dec 01 09:35:00 compute-0 goofy_galois[253937]:         {
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             "devices": [
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "/dev/loop5"
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             ],
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             "lv_name": "ceph_lv2",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             "lv_size": "21470642176",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             "name": "ceph_lv2",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             "tags": {
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.cluster_name": "ceph",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.crush_device_class": "",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.encrypted": "0",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.osd_id": "2",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.type": "block",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:                 "ceph.vdo": "0"
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             },
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             "type": "block",
Dec 01 09:35:00 compute-0 goofy_galois[253937]:             "vg_name": "ceph_vg2"
Dec 01 09:35:00 compute-0 goofy_galois[253937]:         }
Dec 01 09:35:00 compute-0 goofy_galois[253937]:     ]
Dec 01 09:35:00 compute-0 goofy_galois[253937]: }
Dec 01 09:35:00 compute-0 systemd[1]: libpod-ae6077d7a1a152e046e5b9b09ec52dca27712867917601d69337f48f1aa228e6.scope: Deactivated successfully.
Dec 01 09:35:00 compute-0 conmon[253937]: conmon ae6077d7a1a152e046e5 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-ae6077d7a1a152e046e5b9b09ec52dca27712867917601d69337f48f1aa228e6.scope/container/memory.events
Dec 01 09:35:00 compute-0 podman[253920]: 2025-12-01 09:35:00.901453711 +0000 UTC m=+0.933362849 container died ae6077d7a1a152e046e5b9b09ec52dca27712867917601d69337f48f1aa228e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_galois, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, ceph=True, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:35:00 compute-0 systemd[1]: var-lib-containers-storage-overlay-f42d7ff89e3ed9e1a1a9b36a5a911ff7a7be88b3d7023711fbd7c89b3434d4f0-merged.mount: Deactivated successfully.
Dec 01 09:35:00 compute-0 podman[253920]: 2025-12-01 09:35:00.968646415 +0000 UTC m=+1.000555533 container remove ae6077d7a1a152e046e5b9b09ec52dca27712867917601d69337f48f1aa228e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_galois, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, ceph=True)
Dec 01 09:35:00 compute-0 podman[253946]: 2025-12-01 09:35:00.97159396 +0000 UTC m=+0.072875188 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 01 09:35:00 compute-0 systemd[1]: libpod-conmon-ae6077d7a1a152e046e5b9b09ec52dca27712867917601d69337f48f1aa228e6.scope: Deactivated successfully.
Dec 01 09:35:01 compute-0 sudo[253815]: pam_unix(sudo:session): session closed for user root
Dec 01 09:35:01 compute-0 ceph-mon[75031]: pgmap v691: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:01 compute-0 sudo[253974]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:35:01 compute-0 sudo[253974]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:35:01 compute-0 sudo[253974]: pam_unix(sudo:session): session closed for user root
Dec 01 09:35:01 compute-0 sudo[253999]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:35:01 compute-0 sudo[253999]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:35:01 compute-0 sudo[253999]: pam_unix(sudo:session): session closed for user root
Dec 01 09:35:01 compute-0 sudo[254024]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:35:01 compute-0 sudo[254024]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:35:01 compute-0 sudo[254024]: pam_unix(sudo:session): session closed for user root
Dec 01 09:35:01 compute-0 sudo[254049]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- raw list --format json
Dec 01 09:35:01 compute-0 sudo[254049]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:35:01 compute-0 podman[254116]: 2025-12-01 09:35:01.614634164 +0000 UTC m=+0.043853143 container create 576edfa12d3e1dc1b45e265ecf1eea23a8c43afcf32a09a43b55b7902ae4502b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_moser, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:35:01 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v692: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:01 compute-0 systemd[1]: Started libpod-conmon-576edfa12d3e1dc1b45e265ecf1eea23a8c43afcf32a09a43b55b7902ae4502b.scope.
Dec 01 09:35:01 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:35:01 compute-0 podman[254116]: 2025-12-01 09:35:01.595767851 +0000 UTC m=+0.024986830 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:35:01 compute-0 podman[254116]: 2025-12-01 09:35:01.700955147 +0000 UTC m=+0.130174126 container init 576edfa12d3e1dc1b45e265ecf1eea23a8c43afcf32a09a43b55b7902ae4502b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_moser, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, ceph=True, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:35:01 compute-0 podman[254116]: 2025-12-01 09:35:01.71006286 +0000 UTC m=+0.139281819 container start 576edfa12d3e1dc1b45e265ecf1eea23a8c43afcf32a09a43b55b7902ae4502b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_moser, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:35:01 compute-0 podman[254116]: 2025-12-01 09:35:01.713587831 +0000 UTC m=+0.142806810 container attach 576edfa12d3e1dc1b45e265ecf1eea23a8c43afcf32a09a43b55b7902ae4502b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_moser, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef)
Dec 01 09:35:01 compute-0 laughing_moser[254132]: 167 167
Dec 01 09:35:01 compute-0 systemd[1]: libpod-576edfa12d3e1dc1b45e265ecf1eea23a8c43afcf32a09a43b55b7902ae4502b.scope: Deactivated successfully.
Dec 01 09:35:01 compute-0 conmon[254132]: conmon 576edfa12d3e1dc1b45e <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-576edfa12d3e1dc1b45e265ecf1eea23a8c43afcf32a09a43b55b7902ae4502b.scope/container/memory.events
Dec 01 09:35:01 compute-0 podman[254116]: 2025-12-01 09:35:01.719588364 +0000 UTC m=+0.148807333 container died 576edfa12d3e1dc1b45e265ecf1eea23a8c43afcf32a09a43b55b7902ae4502b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_moser, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:35:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-7e60fc37e4d1219ce19fe63c7904bc4f69257b714ee4149290f5efc3b1d7bcd8-merged.mount: Deactivated successfully.
Dec 01 09:35:01 compute-0 podman[254116]: 2025-12-01 09:35:01.767880603 +0000 UTC m=+0.197099572 container remove 576edfa12d3e1dc1b45e265ecf1eea23a8c43afcf32a09a43b55b7902ae4502b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_moser, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 01 09:35:01 compute-0 systemd[1]: libpod-conmon-576edfa12d3e1dc1b45e265ecf1eea23a8c43afcf32a09a43b55b7902ae4502b.scope: Deactivated successfully.
Dec 01 09:35:01 compute-0 podman[254153]: 2025-12-01 09:35:01.980868372 +0000 UTC m=+0.067240016 container create 93da24b5068d6c6dafe22185fc73877896437fe1ad474e0b29c47128a6b7f48f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigilant_ganguly, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:35:02 compute-0 systemd[1]: Started libpod-conmon-93da24b5068d6c6dafe22185fc73877896437fe1ad474e0b29c47128a6b7f48f.scope.
Dec 01 09:35:02 compute-0 podman[254153]: 2025-12-01 09:35:01.954445462 +0000 UTC m=+0.040817126 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:35:02 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:35:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f45649913c75d1120d3aa25fb5e26c3b295d8c6deb25c137564107b7f9e1198/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:35:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f45649913c75d1120d3aa25fb5e26c3b295d8c6deb25c137564107b7f9e1198/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:35:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f45649913c75d1120d3aa25fb5e26c3b295d8c6deb25c137564107b7f9e1198/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:35:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f45649913c75d1120d3aa25fb5e26c3b295d8c6deb25c137564107b7f9e1198/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:35:02 compute-0 podman[254153]: 2025-12-01 09:35:02.109891765 +0000 UTC m=+0.196263459 container init 93da24b5068d6c6dafe22185fc73877896437fe1ad474e0b29c47128a6b7f48f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigilant_ganguly, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507)
Dec 01 09:35:02 compute-0 podman[254153]: 2025-12-01 09:35:02.120926622 +0000 UTC m=+0.207298226 container start 93da24b5068d6c6dafe22185fc73877896437fe1ad474e0b29c47128a6b7f48f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigilant_ganguly, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0)
Dec 01 09:35:02 compute-0 podman[254153]: 2025-12-01 09:35:02.125878925 +0000 UTC m=+0.212250639 container attach 93da24b5068d6c6dafe22185fc73877896437fe1ad474e0b29c47128a6b7f48f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigilant_ganguly, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Dec 01 09:35:03 compute-0 ceph-mon[75031]: pgmap v692: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:03 compute-0 vigilant_ganguly[254170]: {
Dec 01 09:35:03 compute-0 vigilant_ganguly[254170]:     "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec 01 09:35:03 compute-0 vigilant_ganguly[254170]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:35:03 compute-0 vigilant_ganguly[254170]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 01 09:35:03 compute-0 vigilant_ganguly[254170]:         "osd_id": 0,
Dec 01 09:35:03 compute-0 vigilant_ganguly[254170]:         "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:35:03 compute-0 vigilant_ganguly[254170]:         "type": "bluestore"
Dec 01 09:35:03 compute-0 vigilant_ganguly[254170]:     },
Dec 01 09:35:03 compute-0 vigilant_ganguly[254170]:     "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec 01 09:35:03 compute-0 vigilant_ganguly[254170]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:35:03 compute-0 vigilant_ganguly[254170]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 01 09:35:03 compute-0 vigilant_ganguly[254170]:         "osd_id": 1,
Dec 01 09:35:03 compute-0 vigilant_ganguly[254170]:         "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:35:03 compute-0 vigilant_ganguly[254170]:         "type": "bluestore"
Dec 01 09:35:03 compute-0 vigilant_ganguly[254170]:     },
Dec 01 09:35:03 compute-0 vigilant_ganguly[254170]:     "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec 01 09:35:03 compute-0 vigilant_ganguly[254170]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:35:03 compute-0 vigilant_ganguly[254170]:         "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec 01 09:35:03 compute-0 vigilant_ganguly[254170]:         "osd_id": 2,
Dec 01 09:35:03 compute-0 vigilant_ganguly[254170]:         "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:35:03 compute-0 vigilant_ganguly[254170]:         "type": "bluestore"
Dec 01 09:35:03 compute-0 vigilant_ganguly[254170]:     }
Dec 01 09:35:03 compute-0 vigilant_ganguly[254170]: }
Dec 01 09:35:03 compute-0 systemd[1]: libpod-93da24b5068d6c6dafe22185fc73877896437fe1ad474e0b29c47128a6b7f48f.scope: Deactivated successfully.
Dec 01 09:35:03 compute-0 systemd[1]: libpod-93da24b5068d6c6dafe22185fc73877896437fe1ad474e0b29c47128a6b7f48f.scope: Consumed 1.060s CPU time.
Dec 01 09:35:03 compute-0 podman[254153]: 2025-12-01 09:35:03.173661494 +0000 UTC m=+1.260033138 container died 93da24b5068d6c6dafe22185fc73877896437fe1ad474e0b29c47128a6b7f48f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigilant_ganguly, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:35:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-9f45649913c75d1120d3aa25fb5e26c3b295d8c6deb25c137564107b7f9e1198-merged.mount: Deactivated successfully.
Dec 01 09:35:03 compute-0 podman[254153]: 2025-12-01 09:35:03.227094902 +0000 UTC m=+1.313466506 container remove 93da24b5068d6c6dafe22185fc73877896437fe1ad474e0b29c47128a6b7f48f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigilant_ganguly, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2)
Dec 01 09:35:03 compute-0 systemd[1]: libpod-conmon-93da24b5068d6c6dafe22185fc73877896437fe1ad474e0b29c47128a6b7f48f.scope: Deactivated successfully.
Dec 01 09:35:03 compute-0 sudo[254049]: pam_unix(sudo:session): session closed for user root
Dec 01 09:35:03 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:35:03 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:35:03 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:35:03 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:35:03 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 24bf5ea5-9d19-4667-b3fd-74b3e74f1da5 does not exist
Dec 01 09:35:03 compute-0 sudo[254216]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:35:03 compute-0 sudo[254216]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:35:03 compute-0 sudo[254216]: pam_unix(sudo:session): session closed for user root
Dec 01 09:35:03 compute-0 sudo[254241]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:35:03 compute-0 sudo[254241]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:35:03 compute-0 sudo[254241]: pam_unix(sudo:session): session closed for user root
Dec 01 09:35:03 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v693: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:04 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:35:04 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:35:05 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:35:05 compute-0 ceph-mon[75031]: pgmap v693: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:05 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v694: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:07 compute-0 ceph-mon[75031]: pgmap v694: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:07 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v695: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:09 compute-0 ceph-mon[75031]: pgmap v695: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:09 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v696: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:10 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:35:11 compute-0 ceph-mon[75031]: pgmap v696: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:11 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v697: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:35:13
Dec 01 09:35:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:35:13 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:35:13 compute-0 ceph-mgr[75324]: [balancer INFO root] pools ['images', 'volumes', 'cephfs.cephfs.meta', 'vms', 'backups', 'cephfs.cephfs.data', '.mgr']
Dec 01 09:35:13 compute-0 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec 01 09:35:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:35:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:35:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:35:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:35:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:35:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:35:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:35:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:35:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:35:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:35:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:35:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:35:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:35:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:35:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:35:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:35:13 compute-0 ceph-mon[75031]: pgmap v697: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:13 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v698: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:15 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:35:15 compute-0 ceph-mon[75031]: pgmap v698: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:15 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v699: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:17 compute-0 ceph-mon[75031]: pgmap v699: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:17 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v700: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:35:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:35:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:35:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:35:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:35:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:35:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:35:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:35:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:35:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:35:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:35:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:35:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec 01 09:35:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:35:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:35:18 compute-0 podman[254266]: 2025-12-01 09:35:18.996861399 +0000 UTC m=+0.090224387 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 01 09:35:19 compute-0 ceph-mon[75031]: pgmap v700: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:19 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v701: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:20 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:35:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:35:20.471 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:35:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:35:20.471 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:35:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:35:20.472 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:35:21 compute-0 ceph-mon[75031]: pgmap v701: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:21 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v702: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:23 compute-0 ceph-mon[75031]: pgmap v702: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:23 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v703: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:35:25 compute-0 ceph-mon[75031]: pgmap v703: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:25 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v704: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:27 compute-0 podman[254286]: 2025-12-01 09:35:27.075263766 +0000 UTC m=+0.166319666 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 01 09:35:27 compute-0 ceph-mon[75031]: pgmap v704: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:27 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v705: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:28 compute-0 nova_compute[250706]: 2025-12-01 09:35:28.048 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:35:28 compute-0 nova_compute[250706]: 2025-12-01 09:35:28.051 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:35:28 compute-0 nova_compute[250706]: 2025-12-01 09:35:28.051 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 09:35:28 compute-0 nova_compute[250706]: 2025-12-01 09:35:28.051 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 09:35:28 compute-0 nova_compute[250706]: 2025-12-01 09:35:28.075 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 09:35:28 compute-0 nova_compute[250706]: 2025-12-01 09:35:28.076 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:35:28 compute-0 nova_compute[250706]: 2025-12-01 09:35:28.076 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:35:28 compute-0 nova_compute[250706]: 2025-12-01 09:35:28.076 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 09:35:29 compute-0 nova_compute[250706]: 2025-12-01 09:35:29.051 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:35:29 compute-0 nova_compute[250706]: 2025-12-01 09:35:29.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:35:29 compute-0 nova_compute[250706]: 2025-12-01 09:35:29.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:35:29 compute-0 nova_compute[250706]: 2025-12-01 09:35:29.087 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:35:29 compute-0 nova_compute[250706]: 2025-12-01 09:35:29.087 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:35:29 compute-0 nova_compute[250706]: 2025-12-01 09:35:29.088 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:35:29 compute-0 nova_compute[250706]: 2025-12-01 09:35:29.088 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 09:35:29 compute-0 nova_compute[250706]: 2025-12-01 09:35:29.088 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:35:29 compute-0 ceph-mon[75031]: pgmap v705: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:29 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:35:29 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1296109139' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:35:29 compute-0 nova_compute[250706]: 2025-12-01 09:35:29.613 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.525s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:35:29 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v706: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:30 compute-0 nova_compute[250706]: 2025-12-01 09:35:30.022 250710 WARNING nova.virt.libvirt.driver [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 09:35:30 compute-0 nova_compute[250706]: 2025-12-01 09:35:30.023 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5310MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 09:35:30 compute-0 nova_compute[250706]: 2025-12-01 09:35:30.023 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:35:30 compute-0 nova_compute[250706]: 2025-12-01 09:35:30.024 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:35:30 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:35:30 compute-0 nova_compute[250706]: 2025-12-01 09:35:30.373 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 09:35:30 compute-0 nova_compute[250706]: 2025-12-01 09:35:30.374 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 09:35:30 compute-0 nova_compute[250706]: 2025-12-01 09:35:30.393 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:35:30 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1296109139' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:35:30 compute-0 ceph-mon[75031]: pgmap v706: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:30 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:35:30 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3376002074' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:35:30 compute-0 nova_compute[250706]: 2025-12-01 09:35:30.864 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:35:30 compute-0 nova_compute[250706]: 2025-12-01 09:35:30.871 250710 DEBUG nova.compute.provider_tree [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed in ProviderTree for provider: 847e3dbe-0f76-4032-a374-8c965945c22f update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 09:35:31 compute-0 nova_compute[250706]: 2025-12-01 09:35:31.224 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed for provider 847e3dbe-0f76-4032-a374-8c965945c22f based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 09:35:31 compute-0 nova_compute[250706]: 2025-12-01 09:35:31.227 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 09:35:31 compute-0 nova_compute[250706]: 2025-12-01 09:35:31.227 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:35:31 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3376002074' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:35:31 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v707: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:31 compute-0 podman[254357]: 2025-12-01 09:35:31.984016297 +0000 UTC m=+0.076507443 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 01 09:35:32 compute-0 nova_compute[250706]: 2025-12-01 09:35:32.227 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:35:32 compute-0 nova_compute[250706]: 2025-12-01 09:35:32.228 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:35:32 compute-0 ceph-mon[75031]: pgmap v707: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:33 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v708: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:34 compute-0 ceph-mon[75031]: pgmap v708: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:35 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:35:35 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v709: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:36 compute-0 ceph-mon[75031]: pgmap v709: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:37 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v710: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:38 compute-0 ceph-mon[75031]: pgmap v710: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:39 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v711: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:35:40 compute-0 ceph-mon[75031]: pgmap v711: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:41 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v712: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:42 compute-0 ceph-mon[75031]: pgmap v712: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:35:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:35:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:35:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:35:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:35:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:35:43 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v713: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Dec 01 09:35:44 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3536077626' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:35:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Dec 01 09:35:44 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3536077626' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:35:44 compute-0 ceph-mon[75031]: pgmap v713: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:44 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/3536077626' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:35:44 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/3536077626' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:35:45 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:35:45 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v714: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:46 compute-0 ceph-mon[75031]: pgmap v714: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:47 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v715: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:48 compute-0 ceph-mon[75031]: pgmap v715: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:49 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v716: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:49 compute-0 podman[254377]: 2025-12-01 09:35:49.968931337 +0000 UTC m=+0.076540304 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 01 09:35:50 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:35:50 compute-0 ceph-mon[75031]: pgmap v716: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:51 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v717: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:52 compute-0 ceph-mon[75031]: pgmap v717: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:53 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v718: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:54 compute-0 ceph-mon[75031]: pgmap v718: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:55 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:35:55 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v719: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:56 compute-0 ceph-mon[75031]: pgmap v719: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:57 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v720: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:58 compute-0 podman[254398]: 2025-12-01 09:35:58.065842598 +0000 UTC m=+0.157001229 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:35:58 compute-0 ceph-mon[75031]: pgmap v720: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:35:59 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v721: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:00 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:36:00 compute-0 ceph-mon[75031]: pgmap v721: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:01 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v722: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:02 compute-0 ceph-mon[75031]: pgmap v722: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:02 compute-0 podman[254425]: 2025-12-01 09:36:02.953405948 +0000 UTC m=+0.056620481 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 01 09:36:03 compute-0 sudo[254444]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:36:03 compute-0 sudo[254444]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:36:03 compute-0 sudo[254444]: pam_unix(sudo:session): session closed for user root
Dec 01 09:36:03 compute-0 sudo[254469]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:36:03 compute-0 sudo[254469]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:36:03 compute-0 sudo[254469]: pam_unix(sudo:session): session closed for user root
Dec 01 09:36:03 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v723: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:03 compute-0 sudo[254494]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:36:03 compute-0 sudo[254494]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:36:03 compute-0 sudo[254494]: pam_unix(sudo:session): session closed for user root
Dec 01 09:36:03 compute-0 sudo[254519]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Dec 01 09:36:03 compute-0 sudo[254519]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:36:04 compute-0 sudo[254519]: pam_unix(sudo:session): session closed for user root
Dec 01 09:36:04 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:36:04 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:36:04 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec 01 09:36:04 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:36:04 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec 01 09:36:04 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:36:04 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 5847a9e0-ec55-4520-b9f0-94b8497b55b9 does not exist
Dec 01 09:36:04 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev f5a63434-ec1a-4735-bc00-ba1d613bc515 does not exist
Dec 01 09:36:04 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev c98bda81-4117-4dc9-a8ce-ac1adb22641a does not exist
Dec 01 09:36:04 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec 01 09:36:04 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:36:04 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec 01 09:36:04 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:36:04 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:36:04 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:36:04 compute-0 sudo[254575]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:36:04 compute-0 sudo[254575]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:36:04 compute-0 sudo[254575]: pam_unix(sudo:session): session closed for user root
Dec 01 09:36:04 compute-0 sudo[254600]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:36:04 compute-0 sudo[254600]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:36:04 compute-0 sudo[254600]: pam_unix(sudo:session): session closed for user root
Dec 01 09:36:04 compute-0 sudo[254625]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:36:04 compute-0 sudo[254625]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:36:04 compute-0 sudo[254625]: pam_unix(sudo:session): session closed for user root
Dec 01 09:36:04 compute-0 sudo[254650]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --yes --no-systemd
Dec 01 09:36:04 compute-0 sudo[254650]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:36:04 compute-0 ceph-mon[75031]: pgmap v723: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:04 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:36:04 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:36:04 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:36:04 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:36:04 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:36:04 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:36:05 compute-0 podman[254717]: 2025-12-01 09:36:05.085977922 +0000 UTC m=+0.054927851 container create aa78ec773c820c0bd077fbbefc09dc14e5ded22995bc0c59821fa6f91c6abf68 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_aryabhata, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_REF=reef, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:36:05 compute-0 systemd[1]: Started libpod-conmon-aa78ec773c820c0bd077fbbefc09dc14e5ded22995bc0c59821fa6f91c6abf68.scope.
Dec 01 09:36:05 compute-0 podman[254717]: 2025-12-01 09:36:05.067763648 +0000 UTC m=+0.036713597 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:36:05 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:36:05 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:36:05 compute-0 podman[254717]: 2025-12-01 09:36:05.211248437 +0000 UTC m=+0.180198396 container init aa78ec773c820c0bd077fbbefc09dc14e5ded22995bc0c59821fa6f91c6abf68 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_aryabhata, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef)
Dec 01 09:36:05 compute-0 podman[254717]: 2025-12-01 09:36:05.218815015 +0000 UTC m=+0.187764954 container start aa78ec773c820c0bd077fbbefc09dc14e5ded22995bc0c59821fa6f91c6abf68 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_aryabhata, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2)
Dec 01 09:36:05 compute-0 podman[254717]: 2025-12-01 09:36:05.222723477 +0000 UTC m=+0.191673426 container attach aa78ec773c820c0bd077fbbefc09dc14e5ded22995bc0c59821fa6f91c6abf68 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_aryabhata, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 01 09:36:05 compute-0 sweet_aryabhata[254733]: 167 167
Dec 01 09:36:05 compute-0 systemd[1]: libpod-aa78ec773c820c0bd077fbbefc09dc14e5ded22995bc0c59821fa6f91c6abf68.scope: Deactivated successfully.
Dec 01 09:36:05 compute-0 podman[254717]: 2025-12-01 09:36:05.225887678 +0000 UTC m=+0.194837617 container died aa78ec773c820c0bd077fbbefc09dc14e5ded22995bc0c59821fa6f91c6abf68 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_aryabhata, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 01 09:36:05 compute-0 systemd[1]: var-lib-containers-storage-overlay-f680366d05d9d74d3610136c4b7b6ad24663498ba5975aa8421edcc139c1b8b6-merged.mount: Deactivated successfully.
Dec 01 09:36:05 compute-0 podman[254717]: 2025-12-01 09:36:05.283674991 +0000 UTC m=+0.252624960 container remove aa78ec773c820c0bd077fbbefc09dc14e5ded22995bc0c59821fa6f91c6abf68 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_aryabhata, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, ceph=True, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef)
Dec 01 09:36:05 compute-0 systemd[1]: libpod-conmon-aa78ec773c820c0bd077fbbefc09dc14e5ded22995bc0c59821fa6f91c6abf68.scope: Deactivated successfully.
Dec 01 09:36:05 compute-0 podman[254758]: 2025-12-01 09:36:05.463451814 +0000 UTC m=+0.040863147 container create b04e68f039fc2d131407b84f3d27a46a1bd4a868fe034e016166ad50b23e8e60 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_faraday, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:36:05 compute-0 systemd[1]: Started libpod-conmon-b04e68f039fc2d131407b84f3d27a46a1bd4a868fe034e016166ad50b23e8e60.scope.
Dec 01 09:36:05 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:36:05 compute-0 podman[254758]: 2025-12-01 09:36:05.446536767 +0000 UTC m=+0.023948120 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:36:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3ed33b69d982d6045ac265e24d9ce5f43bd9106081b53d914538e6f7ce45509b/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:36:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3ed33b69d982d6045ac265e24d9ce5f43bd9106081b53d914538e6f7ce45509b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:36:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3ed33b69d982d6045ac265e24d9ce5f43bd9106081b53d914538e6f7ce45509b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:36:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3ed33b69d982d6045ac265e24d9ce5f43bd9106081b53d914538e6f7ce45509b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:36:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3ed33b69d982d6045ac265e24d9ce5f43bd9106081b53d914538e6f7ce45509b/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:36:05 compute-0 podman[254758]: 2025-12-01 09:36:05.559995122 +0000 UTC m=+0.137406475 container init b04e68f039fc2d131407b84f3d27a46a1bd4a868fe034e016166ad50b23e8e60 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_faraday, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True)
Dec 01 09:36:05 compute-0 podman[254758]: 2025-12-01 09:36:05.57241579 +0000 UTC m=+0.149827123 container start b04e68f039fc2d131407b84f3d27a46a1bd4a868fe034e016166ad50b23e8e60 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_faraday, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2)
Dec 01 09:36:05 compute-0 podman[254758]: 2025-12-01 09:36:05.575659393 +0000 UTC m=+0.153070726 container attach b04e68f039fc2d131407b84f3d27a46a1bd4a868fe034e016166ad50b23e8e60 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_faraday, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Dec 01 09:36:05 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v724: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:06 compute-0 bold_faraday[254774]: --> passed data devices: 0 physical, 3 LVM
Dec 01 09:36:06 compute-0 bold_faraday[254774]: --> relative data size: 1.0
Dec 01 09:36:06 compute-0 bold_faraday[254774]: --> All data devices are unavailable
Dec 01 09:36:06 compute-0 systemd[1]: libpod-b04e68f039fc2d131407b84f3d27a46a1bd4a868fe034e016166ad50b23e8e60.scope: Deactivated successfully.
Dec 01 09:36:06 compute-0 systemd[1]: libpod-b04e68f039fc2d131407b84f3d27a46a1bd4a868fe034e016166ad50b23e8e60.scope: Consumed 1.117s CPU time.
Dec 01 09:36:06 compute-0 podman[254758]: 2025-12-01 09:36:06.719183318 +0000 UTC m=+1.296594661 container died b04e68f039fc2d131407b84f3d27a46a1bd4a868fe034e016166ad50b23e8e60 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_faraday, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Dec 01 09:36:06 compute-0 systemd[1]: var-lib-containers-storage-overlay-3ed33b69d982d6045ac265e24d9ce5f43bd9106081b53d914538e6f7ce45509b-merged.mount: Deactivated successfully.
Dec 01 09:36:06 compute-0 podman[254758]: 2025-12-01 09:36:06.779727031 +0000 UTC m=+1.357138374 container remove b04e68f039fc2d131407b84f3d27a46a1bd4a868fe034e016166ad50b23e8e60 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_faraday, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:36:06 compute-0 systemd[1]: libpod-conmon-b04e68f039fc2d131407b84f3d27a46a1bd4a868fe034e016166ad50b23e8e60.scope: Deactivated successfully.
Dec 01 09:36:06 compute-0 sudo[254650]: pam_unix(sudo:session): session closed for user root
Dec 01 09:36:06 compute-0 ceph-mon[75031]: pgmap v724: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:06 compute-0 sudo[254816]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:36:06 compute-0 sudo[254816]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:36:06 compute-0 sudo[254816]: pam_unix(sudo:session): session closed for user root
Dec 01 09:36:07 compute-0 sudo[254841]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:36:07 compute-0 sudo[254841]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:36:07 compute-0 sudo[254841]: pam_unix(sudo:session): session closed for user root
Dec 01 09:36:07 compute-0 sudo[254866]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:36:07 compute-0 sudo[254866]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:36:07 compute-0 sudo[254866]: pam_unix(sudo:session): session closed for user root
Dec 01 09:36:07 compute-0 sudo[254891]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- lvm list --format json
Dec 01 09:36:07 compute-0 sudo[254891]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:36:07 compute-0 podman[254957]: 2025-12-01 09:36:07.585206568 +0000 UTC m=+0.069824261 container create ef6e12df964b3ecdb70d3929c7c76e01665029cf59ff1181a4f12af32fbe49a2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_raman, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Dec 01 09:36:07 compute-0 systemd[1]: Started libpod-conmon-ef6e12df964b3ecdb70d3929c7c76e01665029cf59ff1181a4f12af32fbe49a2.scope.
Dec 01 09:36:07 compute-0 podman[254957]: 2025-12-01 09:36:07.557397498 +0000 UTC m=+0.042015251 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:36:07 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v725: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:07 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:36:07 compute-0 podman[254957]: 2025-12-01 09:36:07.699047033 +0000 UTC m=+0.183664736 container init ef6e12df964b3ecdb70d3929c7c76e01665029cf59ff1181a4f12af32fbe49a2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_raman, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507)
Dec 01 09:36:07 compute-0 podman[254957]: 2025-12-01 09:36:07.709570226 +0000 UTC m=+0.194187939 container start ef6e12df964b3ecdb70d3929c7c76e01665029cf59ff1181a4f12af32fbe49a2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_raman, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507)
Dec 01 09:36:07 compute-0 podman[254957]: 2025-12-01 09:36:07.714417366 +0000 UTC m=+0.199035139 container attach ef6e12df964b3ecdb70d3929c7c76e01665029cf59ff1181a4f12af32fbe49a2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_raman, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Dec 01 09:36:07 compute-0 systemd[1]: libpod-ef6e12df964b3ecdb70d3929c7c76e01665029cf59ff1181a4f12af32fbe49a2.scope: Deactivated successfully.
Dec 01 09:36:07 compute-0 pedantic_raman[254973]: 167 167
Dec 01 09:36:07 compute-0 conmon[254973]: conmon ef6e12df964b3ecdb70d <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-ef6e12df964b3ecdb70d3929c7c76e01665029cf59ff1181a4f12af32fbe49a2.scope/container/memory.events
Dec 01 09:36:07 compute-0 podman[254957]: 2025-12-01 09:36:07.716439794 +0000 UTC m=+0.201057507 container died ef6e12df964b3ecdb70d3929c7c76e01665029cf59ff1181a4f12af32fbe49a2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_raman, CEPH_REF=reef, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:36:07 compute-0 systemd[1]: var-lib-containers-storage-overlay-be274148d1acaf33eeb5395b1ec23eee726105bc8a6b52ff8afa46f53a218e22-merged.mount: Deactivated successfully.
Dec 01 09:36:07 compute-0 podman[254957]: 2025-12-01 09:36:07.764773325 +0000 UTC m=+0.249390988 container remove ef6e12df964b3ecdb70d3929c7c76e01665029cf59ff1181a4f12af32fbe49a2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_raman, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:36:07 compute-0 systemd[1]: libpod-conmon-ef6e12df964b3ecdb70d3929c7c76e01665029cf59ff1181a4f12af32fbe49a2.scope: Deactivated successfully.
Dec 01 09:36:08 compute-0 podman[254996]: 2025-12-01 09:36:08.03043973 +0000 UTC m=+0.076092401 container create f9a8e17c420fecaf2714326a45034266054b4814b8062ff174c4d6ca6741a6be (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_mcclintock, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Dec 01 09:36:08 compute-0 systemd[1]: Started libpod-conmon-f9a8e17c420fecaf2714326a45034266054b4814b8062ff174c4d6ca6741a6be.scope.
Dec 01 09:36:08 compute-0 podman[254996]: 2025-12-01 09:36:07.999770257 +0000 UTC m=+0.045422978 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:36:08 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:36:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9a3965ef62bba1c9faea59ea4841998b42e490c95d68f16aef73b493572238b/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:36:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9a3965ef62bba1c9faea59ea4841998b42e490c95d68f16aef73b493572238b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:36:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9a3965ef62bba1c9faea59ea4841998b42e490c95d68f16aef73b493572238b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:36:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9a3965ef62bba1c9faea59ea4841998b42e490c95d68f16aef73b493572238b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:36:08 compute-0 podman[254996]: 2025-12-01 09:36:08.12532364 +0000 UTC m=+0.170976241 container init f9a8e17c420fecaf2714326a45034266054b4814b8062ff174c4d6ca6741a6be (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_mcclintock, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:36:08 compute-0 podman[254996]: 2025-12-01 09:36:08.138081287 +0000 UTC m=+0.183733878 container start f9a8e17c420fecaf2714326a45034266054b4814b8062ff174c4d6ca6741a6be (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_mcclintock, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:36:08 compute-0 podman[254996]: 2025-12-01 09:36:08.145038207 +0000 UTC m=+0.190690818 container attach f9a8e17c420fecaf2714326a45034266054b4814b8062ff174c4d6ca6741a6be (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_mcclintock, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]: {
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:     "0": [
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:         {
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             "devices": [
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "/dev/loop3"
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             ],
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             "lv_name": "ceph_lv0",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             "lv_size": "21470642176",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             "name": "ceph_lv0",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             "tags": {
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.cluster_name": "ceph",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.crush_device_class": "",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.encrypted": "0",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.osd_id": "0",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.type": "block",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.vdo": "0"
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             },
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             "type": "block",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             "vg_name": "ceph_vg0"
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:         }
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:     ],
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:     "1": [
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:         {
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             "devices": [
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "/dev/loop4"
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             ],
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             "lv_name": "ceph_lv1",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             "lv_size": "21470642176",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             "name": "ceph_lv1",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             "tags": {
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.cluster_name": "ceph",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.crush_device_class": "",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.encrypted": "0",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.osd_id": "1",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.type": "block",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.vdo": "0"
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             },
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             "type": "block",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             "vg_name": "ceph_vg1"
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:         }
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:     ],
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:     "2": [
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:         {
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             "devices": [
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "/dev/loop5"
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             ],
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             "lv_name": "ceph_lv2",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             "lv_size": "21470642176",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             "name": "ceph_lv2",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             "tags": {
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.cluster_name": "ceph",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.crush_device_class": "",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.encrypted": "0",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.osd_id": "2",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.type": "block",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:                 "ceph.vdo": "0"
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             },
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             "type": "block",
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:             "vg_name": "ceph_vg2"
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:         }
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]:     ]
Dec 01 09:36:08 compute-0 nifty_mcclintock[255012]: }
Dec 01 09:36:08 compute-0 systemd[1]: libpod-f9a8e17c420fecaf2714326a45034266054b4814b8062ff174c4d6ca6741a6be.scope: Deactivated successfully.
Dec 01 09:36:08 compute-0 podman[254996]: 2025-12-01 09:36:08.909026731 +0000 UTC m=+0.954679352 container died f9a8e17c420fecaf2714326a45034266054b4814b8062ff174c4d6ca6741a6be (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_mcclintock, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:36:08 compute-0 ceph-mon[75031]: pgmap v725: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:08 compute-0 systemd[1]: var-lib-containers-storage-overlay-f9a3965ef62bba1c9faea59ea4841998b42e490c95d68f16aef73b493572238b-merged.mount: Deactivated successfully.
Dec 01 09:36:08 compute-0 podman[254996]: 2025-12-01 09:36:08.971123508 +0000 UTC m=+1.016776099 container remove f9a8e17c420fecaf2714326a45034266054b4814b8062ff174c4d6ca6741a6be (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_mcclintock, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True)
Dec 01 09:36:08 compute-0 systemd[1]: libpod-conmon-f9a8e17c420fecaf2714326a45034266054b4814b8062ff174c4d6ca6741a6be.scope: Deactivated successfully.
Dec 01 09:36:09 compute-0 sudo[254891]: pam_unix(sudo:session): session closed for user root
Dec 01 09:36:09 compute-0 sudo[255032]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:36:09 compute-0 sudo[255032]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:36:09 compute-0 sudo[255032]: pam_unix(sudo:session): session closed for user root
Dec 01 09:36:09 compute-0 sudo[255057]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:36:09 compute-0 sudo[255057]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:36:09 compute-0 sudo[255057]: pam_unix(sudo:session): session closed for user root
Dec 01 09:36:09 compute-0 sudo[255082]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:36:09 compute-0 sudo[255082]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:36:09 compute-0 sudo[255082]: pam_unix(sudo:session): session closed for user root
Dec 01 09:36:09 compute-0 sudo[255107]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- raw list --format json
Dec 01 09:36:09 compute-0 sudo[255107]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:36:09 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v726: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:09 compute-0 podman[255172]: 2025-12-01 09:36:09.701835415 +0000 UTC m=+0.048860127 container create bc873fc8fc569fc8a9fdd5ac7508594f44529c94e642bfcdecd14aac9019d4ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_ardinghelli, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:36:09 compute-0 systemd[1]: Started libpod-conmon-bc873fc8fc569fc8a9fdd5ac7508594f44529c94e642bfcdecd14aac9019d4ca.scope.
Dec 01 09:36:09 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:36:09 compute-0 podman[255172]: 2025-12-01 09:36:09.681000135 +0000 UTC m=+0.028024857 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:36:09 compute-0 podman[255172]: 2025-12-01 09:36:09.782207998 +0000 UTC m=+0.129232770 container init bc873fc8fc569fc8a9fdd5ac7508594f44529c94e642bfcdecd14aac9019d4ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_ardinghelli, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507)
Dec 01 09:36:09 compute-0 podman[255172]: 2025-12-01 09:36:09.790925298 +0000 UTC m=+0.137949970 container start bc873fc8fc569fc8a9fdd5ac7508594f44529c94e642bfcdecd14aac9019d4ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_ardinghelli, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:36:09 compute-0 podman[255172]: 2025-12-01 09:36:09.794535222 +0000 UTC m=+0.141559924 container attach bc873fc8fc569fc8a9fdd5ac7508594f44529c94e642bfcdecd14aac9019d4ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_ardinghelli, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:36:09 compute-0 compassionate_ardinghelli[255189]: 167 167
Dec 01 09:36:09 compute-0 systemd[1]: libpod-bc873fc8fc569fc8a9fdd5ac7508594f44529c94e642bfcdecd14aac9019d4ca.scope: Deactivated successfully.
Dec 01 09:36:09 compute-0 podman[255194]: 2025-12-01 09:36:09.842210424 +0000 UTC m=+0.031013393 container died bc873fc8fc569fc8a9fdd5ac7508594f44529c94e642bfcdecd14aac9019d4ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_ardinghelli, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:36:09 compute-0 systemd[1]: var-lib-containers-storage-overlay-c3a2224e4d60fe429060980f7176759a6dc04ec0f76e213ca1de2724364ab838-merged.mount: Deactivated successfully.
Dec 01 09:36:09 compute-0 podman[255194]: 2025-12-01 09:36:09.917054808 +0000 UTC m=+0.105857757 container remove bc873fc8fc569fc8a9fdd5ac7508594f44529c94e642bfcdecd14aac9019d4ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_ardinghelli, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Dec 01 09:36:09 compute-0 systemd[1]: libpod-conmon-bc873fc8fc569fc8a9fdd5ac7508594f44529c94e642bfcdecd14aac9019d4ca.scope: Deactivated successfully.
Dec 01 09:36:10 compute-0 podman[255215]: 2025-12-01 09:36:10.104880083 +0000 UTC m=+0.040048814 container create 8dda21f7dfdf5284bb538f704c3666a58d8a790443f154538772e83b11da2bc1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_napier, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:36:10 compute-0 systemd[1]: Started libpod-conmon-8dda21f7dfdf5284bb538f704c3666a58d8a790443f154538772e83b11da2bc1.scope.
Dec 01 09:36:10 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:36:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c74e876f351e30d0ace7c3ab4eb0465696110e805cb081579e98086d0b4409f6/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:36:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c74e876f351e30d0ace7c3ab4eb0465696110e805cb081579e98086d0b4409f6/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:36:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c74e876f351e30d0ace7c3ab4eb0465696110e805cb081579e98086d0b4409f6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:36:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c74e876f351e30d0ace7c3ab4eb0465696110e805cb081579e98086d0b4409f6/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:36:10 compute-0 podman[255215]: 2025-12-01 09:36:10.088257944 +0000 UTC m=+0.023426695 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:36:10 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:36:10 compute-0 podman[255215]: 2025-12-01 09:36:10.19963918 +0000 UTC m=+0.134807931 container init 8dda21f7dfdf5284bb538f704c3666a58d8a790443f154538772e83b11da2bc1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_napier, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:36:10 compute-0 podman[255215]: 2025-12-01 09:36:10.211058608 +0000 UTC m=+0.146227379 container start 8dda21f7dfdf5284bb538f704c3666a58d8a790443f154538772e83b11da2bc1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_napier, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:36:10 compute-0 podman[255215]: 2025-12-01 09:36:10.21528003 +0000 UTC m=+0.150448761 container attach 8dda21f7dfdf5284bb538f704c3666a58d8a790443f154538772e83b11da2bc1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_napier, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0)
Dec 01 09:36:10 compute-0 ceph-mon[75031]: pgmap v726: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:11 compute-0 optimistic_napier[255231]: {
Dec 01 09:36:11 compute-0 optimistic_napier[255231]:     "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec 01 09:36:11 compute-0 optimistic_napier[255231]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:36:11 compute-0 optimistic_napier[255231]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 01 09:36:11 compute-0 optimistic_napier[255231]:         "osd_id": 0,
Dec 01 09:36:11 compute-0 optimistic_napier[255231]:         "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:36:11 compute-0 optimistic_napier[255231]:         "type": "bluestore"
Dec 01 09:36:11 compute-0 optimistic_napier[255231]:     },
Dec 01 09:36:11 compute-0 optimistic_napier[255231]:     "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec 01 09:36:11 compute-0 optimistic_napier[255231]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:36:11 compute-0 optimistic_napier[255231]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 01 09:36:11 compute-0 optimistic_napier[255231]:         "osd_id": 1,
Dec 01 09:36:11 compute-0 optimistic_napier[255231]:         "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:36:11 compute-0 optimistic_napier[255231]:         "type": "bluestore"
Dec 01 09:36:11 compute-0 optimistic_napier[255231]:     },
Dec 01 09:36:11 compute-0 optimistic_napier[255231]:     "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec 01 09:36:11 compute-0 optimistic_napier[255231]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:36:11 compute-0 optimistic_napier[255231]:         "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec 01 09:36:11 compute-0 optimistic_napier[255231]:         "osd_id": 2,
Dec 01 09:36:11 compute-0 optimistic_napier[255231]:         "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:36:11 compute-0 optimistic_napier[255231]:         "type": "bluestore"
Dec 01 09:36:11 compute-0 optimistic_napier[255231]:     }
Dec 01 09:36:11 compute-0 optimistic_napier[255231]: }
Dec 01 09:36:11 compute-0 systemd[1]: libpod-8dda21f7dfdf5284bb538f704c3666a58d8a790443f154538772e83b11da2bc1.scope: Deactivated successfully.
Dec 01 09:36:11 compute-0 systemd[1]: libpod-8dda21f7dfdf5284bb538f704c3666a58d8a790443f154538772e83b11da2bc1.scope: Consumed 1.081s CPU time.
Dec 01 09:36:11 compute-0 podman[255215]: 2025-12-01 09:36:11.287453411 +0000 UTC m=+1.222622202 container died 8dda21f7dfdf5284bb538f704c3666a58d8a790443f154538772e83b11da2bc1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_napier, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:36:11 compute-0 systemd[1]: var-lib-containers-storage-overlay-c74e876f351e30d0ace7c3ab4eb0465696110e805cb081579e98086d0b4409f6-merged.mount: Deactivated successfully.
Dec 01 09:36:11 compute-0 podman[255215]: 2025-12-01 09:36:11.358921317 +0000 UTC m=+1.294090048 container remove 8dda21f7dfdf5284bb538f704c3666a58d8a790443f154538772e83b11da2bc1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_napier, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:36:11 compute-0 systemd[1]: libpod-conmon-8dda21f7dfdf5284bb538f704c3666a58d8a790443f154538772e83b11da2bc1.scope: Deactivated successfully.
Dec 01 09:36:11 compute-0 sudo[255107]: pam_unix(sudo:session): session closed for user root
Dec 01 09:36:11 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:36:11 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:36:11 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:36:11 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:36:11 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 59e50a00-04ad-4c7e-a7c2-7f799bc4ed26 does not exist
Dec 01 09:36:11 compute-0 sudo[255279]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:36:11 compute-0 sudo[255279]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:36:11 compute-0 sudo[255279]: pam_unix(sudo:session): session closed for user root
Dec 01 09:36:11 compute-0 sudo[255304]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:36:11 compute-0 sudo[255304]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:36:11 compute-0 sudo[255304]: pam_unix(sudo:session): session closed for user root
Dec 01 09:36:11 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v727: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:12 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:36:12 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:36:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:36:13
Dec 01 09:36:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:36:13 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:36:13 compute-0 ceph-mgr[75324]: [balancer INFO root] pools ['backups', '.mgr', 'vms', 'volumes', 'cephfs.cephfs.data', 'images', 'cephfs.cephfs.meta']
Dec 01 09:36:13 compute-0 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec 01 09:36:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:36:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:36:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:36:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:36:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:36:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:36:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:36:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:36:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:36:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:36:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:36:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:36:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:36:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:36:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:36:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:36:13 compute-0 ceph-mon[75031]: pgmap v727: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:13 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v728: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:14 compute-0 ceph-mon[75031]: pgmap v728: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:15 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:36:15 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v729: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:16 compute-0 ceph-mon[75031]: pgmap v729: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:17 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v730: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:36:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:36:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:36:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:36:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:36:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:36:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:36:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:36:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:36:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:36:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:36:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:36:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec 01 09:36:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:36:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:36:18 compute-0 ceph-mon[75031]: pgmap v730: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:19 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v731: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:19 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e47 do_prune osdmap full prune enabled
Dec 01 09:36:19 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e48 e48: 3 total, 3 up, 3 in
Dec 01 09:36:19 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e48: 3 total, 3 up, 3 in
Dec 01 09:36:20 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e48 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:36:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:36:20.472 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:36:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:36:20.474 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:36:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:36:20.474 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:36:20 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e48 do_prune osdmap full prune enabled
Dec 01 09:36:20 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e49 e49: 3 total, 3 up, 3 in
Dec 01 09:36:20 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e49: 3 total, 3 up, 3 in
Dec 01 09:36:20 compute-0 ceph-mon[75031]: pgmap v731: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:20 compute-0 ceph-mon[75031]: osdmap e48: 3 total, 3 up, 3 in
Dec 01 09:36:20 compute-0 podman[255329]: 2025-12-01 09:36:20.9930098 +0000 UTC m=+0.091474103 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd)
Dec 01 09:36:21 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v734: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:21 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e49 do_prune osdmap full prune enabled
Dec 01 09:36:21 compute-0 ceph-mon[75031]: osdmap e49: 3 total, 3 up, 3 in
Dec 01 09:36:21 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e50 e50: 3 total, 3 up, 3 in
Dec 01 09:36:21 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e50: 3 total, 3 up, 3 in
Dec 01 09:36:22 compute-0 ceph-mon[75031]: pgmap v734: 193 pgs: 193 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:22 compute-0 ceph-mon[75031]: osdmap e50: 3 total, 3 up, 3 in
Dec 01 09:36:22 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #36. Immutable memtables: 0.
Dec 01 09:36:22 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:36:22.792420) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 09:36:22 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 36
Dec 01 09:36:22 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581782792459, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 1590, "num_deletes": 251, "total_data_size": 1730078, "memory_usage": 1764144, "flush_reason": "Manual Compaction"}
Dec 01 09:36:22 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #37: started
Dec 01 09:36:22 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581782805519, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 37, "file_size": 1676985, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14032, "largest_seqno": 15621, "table_properties": {"data_size": 1669695, "index_size": 4301, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1925, "raw_key_size": 15173, "raw_average_key_size": 19, "raw_value_size": 1654857, "raw_average_value_size": 2163, "num_data_blocks": 198, "num_entries": 765, "num_filter_entries": 765, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764581620, "oldest_key_time": 1764581620, "file_creation_time": 1764581782, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:36:22 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 13139 microseconds, and 5483 cpu microseconds.
Dec 01 09:36:22 compute-0 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 09:36:22 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:36:22.805559) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #37: 1676985 bytes OK
Dec 01 09:36:22 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:36:22.805575) [db/memtable_list.cc:519] [default] Level-0 commit table #37 started
Dec 01 09:36:22 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:36:22.807626) [db/memtable_list.cc:722] [default] Level-0 commit table #37: memtable #1 done
Dec 01 09:36:22 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:36:22.807658) EVENT_LOG_v1 {"time_micros": 1764581782807650, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 01 09:36:22 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:36:22.807683) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 01 09:36:22 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 1723149, prev total WAL file size 1723149, number of live WAL files 2.
Dec 01 09:36:22 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000033.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:36:22 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:36:22.808644) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031303034' seq:72057594037927935, type:22 .. '7061786F730031323536' seq:0, type:0; will stop at (end)
Dec 01 09:36:22 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 01 09:36:22 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [37(1637KB)], [35(5014KB)]
Dec 01 09:36:22 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581782808716, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [37], "files_L6": [35], "score": -1, "input_data_size": 6811516, "oldest_snapshot_seqno": -1}
Dec 01 09:36:22 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #38: 3499 keys, 5621158 bytes, temperature: kUnknown
Dec 01 09:36:22 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581782849995, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 38, "file_size": 5621158, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 5595316, "index_size": 16005, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8773, "raw_key_size": 82833, "raw_average_key_size": 23, "raw_value_size": 5529880, "raw_average_value_size": 1580, "num_data_blocks": 692, "num_entries": 3499, "num_filter_entries": 3499, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580340, "oldest_key_time": 0, "file_creation_time": 1764581782, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:36:22 compute-0 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 09:36:22 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:36:22.850248) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 5621158 bytes
Dec 01 09:36:22 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:36:22.854321) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 164.7 rd, 135.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.6, 4.9 +0.0 blob) out(5.4 +0.0 blob), read-write-amplify(7.4) write-amplify(3.4) OK, records in: 4017, records dropped: 518 output_compression: NoCompression
Dec 01 09:36:22 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:36:22.854340) EVENT_LOG_v1 {"time_micros": 1764581782854330, "job": 16, "event": "compaction_finished", "compaction_time_micros": 41365, "compaction_time_cpu_micros": 15424, "output_level": 6, "num_output_files": 1, "total_output_size": 5621158, "num_input_records": 4017, "num_output_records": 3499, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 09:36:22 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000037.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:36:22 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581782854691, "job": 16, "event": "table_file_deletion", "file_number": 37}
Dec 01 09:36:22 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:36:22 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581782855498, "job": 16, "event": "table_file_deletion", "file_number": 35}
Dec 01 09:36:22 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:36:22.808521) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:36:22 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:36:22.855615) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:36:22 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:36:22.855620) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:36:22 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:36:22.855622) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:36:22 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:36:22.855623) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:36:22 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:36:22.855625) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:36:23 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v736: 193 pgs: 193 active+clean; 16 MiB data, 96 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 2.7 MiB/s wr, 22 op/s
Dec 01 09:36:23 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e50 do_prune osdmap full prune enabled
Dec 01 09:36:23 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e51 e51: 3 total, 3 up, 3 in
Dec 01 09:36:23 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e51: 3 total, 3 up, 3 in
Dec 01 09:36:24 compute-0 ceph-mon[75031]: pgmap v736: 193 pgs: 193 active+clean; 16 MiB data, 96 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 2.7 MiB/s wr, 22 op/s
Dec 01 09:36:24 compute-0 ceph-mon[75031]: osdmap e51: 3 total, 3 up, 3 in
Dec 01 09:36:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:36:25 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v738: 193 pgs: 193 active+clean; 33 MiB data, 113 MiB used, 60 GiB / 60 GiB avail; 44 KiB/s rd, 5.6 MiB/s wr, 63 op/s
Dec 01 09:36:26 compute-0 ceph-mon[75031]: pgmap v738: 193 pgs: 193 active+clean; 33 MiB data, 113 MiB used, 60 GiB / 60 GiB avail; 44 KiB/s rd, 5.6 MiB/s wr, 63 op/s
Dec 01 09:36:27 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v739: 193 pgs: 193 active+clean; 41 MiB data, 121 MiB used, 60 GiB / 60 GiB avail; 38 KiB/s rd, 5.9 MiB/s wr, 55 op/s
Dec 01 09:36:28 compute-0 ceph-mon[75031]: pgmap v739: 193 pgs: 193 active+clean; 41 MiB data, 121 MiB used, 60 GiB / 60 GiB avail; 38 KiB/s rd, 5.9 MiB/s wr, 55 op/s
Dec 01 09:36:29 compute-0 podman[255349]: 2025-12-01 09:36:29.000517725 +0000 UTC m=+0.091271927 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec 01 09:36:29 compute-0 nova_compute[250706]: 2025-12-01 09:36:29.047 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:36:29 compute-0 nova_compute[250706]: 2025-12-01 09:36:29.051 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:36:29 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v740: 193 pgs: 193 active+clean; 41 MiB data, 121 MiB used, 60 GiB / 60 GiB avail; 33 KiB/s rd, 5.1 MiB/s wr, 47 op/s
Dec 01 09:36:30 compute-0 nova_compute[250706]: 2025-12-01 09:36:30.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:36:30 compute-0 nova_compute[250706]: 2025-12-01 09:36:30.053 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 09:36:30 compute-0 nova_compute[250706]: 2025-12-01 09:36:30.053 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 09:36:30 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:36:30 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e51 do_prune osdmap full prune enabled
Dec 01 09:36:30 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e52 e52: 3 total, 3 up, 3 in
Dec 01 09:36:30 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e52: 3 total, 3 up, 3 in
Dec 01 09:36:30 compute-0 nova_compute[250706]: 2025-12-01 09:36:30.256 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 09:36:30 compute-0 nova_compute[250706]: 2025-12-01 09:36:30.256 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:36:30 compute-0 nova_compute[250706]: 2025-12-01 09:36:30.257 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:36:30 compute-0 nova_compute[250706]: 2025-12-01 09:36:30.257 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:36:30 compute-0 nova_compute[250706]: 2025-12-01 09:36:30.258 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 09:36:30 compute-0 nova_compute[250706]: 2025-12-01 09:36:30.258 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:36:30 compute-0 nova_compute[250706]: 2025-12-01 09:36:30.295 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:36:30 compute-0 nova_compute[250706]: 2025-12-01 09:36:30.296 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:36:30 compute-0 nova_compute[250706]: 2025-12-01 09:36:30.296 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:36:30 compute-0 nova_compute[250706]: 2025-12-01 09:36:30.297 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 09:36:30 compute-0 nova_compute[250706]: 2025-12-01 09:36:30.297 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:36:30 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:36:30 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1550246144' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:36:30 compute-0 nova_compute[250706]: 2025-12-01 09:36:30.783 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:36:30 compute-0 nova_compute[250706]: 2025-12-01 09:36:30.977 250710 WARNING nova.virt.libvirt.driver [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 09:36:30 compute-0 nova_compute[250706]: 2025-12-01 09:36:30.978 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5310MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 09:36:30 compute-0 nova_compute[250706]: 2025-12-01 09:36:30.979 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:36:30 compute-0 nova_compute[250706]: 2025-12-01 09:36:30.979 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:36:31 compute-0 nova_compute[250706]: 2025-12-01 09:36:31.079 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 09:36:31 compute-0 nova_compute[250706]: 2025-12-01 09:36:31.079 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 09:36:31 compute-0 nova_compute[250706]: 2025-12-01 09:36:31.100 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:36:31 compute-0 ceph-mon[75031]: pgmap v740: 193 pgs: 193 active+clean; 41 MiB data, 121 MiB used, 60 GiB / 60 GiB avail; 33 KiB/s rd, 5.1 MiB/s wr, 47 op/s
Dec 01 09:36:31 compute-0 ceph-mon[75031]: osdmap e52: 3 total, 3 up, 3 in
Dec 01 09:36:31 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1550246144' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:36:31 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:36:31 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2661800007' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:36:31 compute-0 nova_compute[250706]: 2025-12-01 09:36:31.578 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:36:31 compute-0 nova_compute[250706]: 2025-12-01 09:36:31.584 250710 DEBUG nova.compute.provider_tree [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed in ProviderTree for provider: 847e3dbe-0f76-4032-a374-8c965945c22f update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 09:36:31 compute-0 nova_compute[250706]: 2025-12-01 09:36:31.607 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed for provider 847e3dbe-0f76-4032-a374-8c965945c22f based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 09:36:31 compute-0 nova_compute[250706]: 2025-12-01 09:36:31.608 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 09:36:31 compute-0 nova_compute[250706]: 2025-12-01 09:36:31.609 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.630s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:36:31 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v742: 193 pgs: 193 active+clean; 41 MiB data, 121 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 3.1 MiB/s wr, 30 op/s
Dec 01 09:36:32 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2661800007' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:36:32 compute-0 nova_compute[250706]: 2025-12-01 09:36:32.605 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:36:32 compute-0 nova_compute[250706]: 2025-12-01 09:36:32.628 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:36:32 compute-0 nova_compute[250706]: 2025-12-01 09:36:32.629 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:36:33 compute-0 ceph-mon[75031]: pgmap v742: 193 pgs: 193 active+clean; 41 MiB data, 121 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 3.1 MiB/s wr, 30 op/s
Dec 01 09:36:33 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v743: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 16 KiB/s rd, 2.5 MiB/s wr, 25 op/s
Dec 01 09:36:33 compute-0 podman[255419]: 2025-12-01 09:36:33.989490653 +0000 UTC m=+0.075083842 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec 01 09:36:35 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e52 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:36:35 compute-0 ceph-mon[75031]: pgmap v743: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 16 KiB/s rd, 2.5 MiB/s wr, 25 op/s
Dec 01 09:36:35 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v744: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 307 B/s rd, 819 KiB/s wr, 0 op/s
Dec 01 09:36:37 compute-0 ceph-mon[75031]: pgmap v744: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 307 B/s rd, 819 KiB/s wr, 0 op/s
Dec 01 09:36:37 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v745: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:39 compute-0 ceph-mon[75031]: pgmap v745: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:39 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v746: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e52 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:36:41 compute-0 ceph-mon[75031]: pgmap v746: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:41 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v747: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:36:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:36:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:36:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:36:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:36:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:36:43 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:36:43.182 159899 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:9e:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '66:a0:73:58:3b:fd'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 01 09:36:43 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:36:43.183 159899 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 01 09:36:43 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:36:43.187 159899 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a8013a17-6378-4c2f-a5de-9d3b29c7a42e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 01 09:36:43 compute-0 ceph-mon[75031]: pgmap v747: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:43 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v748: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Dec 01 09:36:44 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1938621809' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:36:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Dec 01 09:36:44 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1938621809' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:36:45 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e52 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:36:45 compute-0 ceph-mon[75031]: pgmap v748: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:45 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/1938621809' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:36:45 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/1938621809' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:36:45 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v749: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:47 compute-0 ceph-mon[75031]: pgmap v749: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:47 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v750: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:49 compute-0 ceph-mon[75031]: pgmap v750: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:49 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v751: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:50 compute-0 rsyslogd[1007]: imjournal: 1311 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Dec 01 09:36:50 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e52 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:36:51 compute-0 ceph-mon[75031]: pgmap v751: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:51 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v752: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:51 compute-0 podman[255439]: 2025-12-01 09:36:51.977892242 +0000 UTC m=+0.068140272 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Dec 01 09:36:53 compute-0 ceph-mon[75031]: pgmap v752: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:53 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v753: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:55 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e52 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:36:55 compute-0 ceph-mon[75031]: pgmap v753: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:55 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v754: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:56 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e52 do_prune osdmap full prune enabled
Dec 01 09:36:56 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e53 e53: 3 total, 3 up, 3 in
Dec 01 09:36:56 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e53: 3 total, 3 up, 3 in
Dec 01 09:36:57 compute-0 ceph-mon[75031]: pgmap v754: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:36:57 compute-0 ceph-mon[75031]: osdmap e53: 3 total, 3 up, 3 in
Dec 01 09:36:57 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v756: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 102 B/s wr, 1 op/s
Dec 01 09:36:59 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e53 do_prune osdmap full prune enabled
Dec 01 09:36:59 compute-0 ceph-mon[75031]: pgmap v756: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 102 B/s wr, 1 op/s
Dec 01 09:36:59 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e54 e54: 3 total, 3 up, 3 in
Dec 01 09:36:59 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e54: 3 total, 3 up, 3 in
Dec 01 09:36:59 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v758: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 127 B/s wr, 2 op/s
Dec 01 09:37:00 compute-0 podman[255460]: 2025-12-01 09:37:00.016971917 +0000 UTC m=+0.117157642 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 01 09:37:00 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e54 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:37:00 compute-0 ceph-mon[75031]: osdmap e54: 3 total, 3 up, 3 in
Dec 01 09:37:01 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e54 do_prune osdmap full prune enabled
Dec 01 09:37:01 compute-0 ceph-mon[75031]: pgmap v758: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 127 B/s wr, 2 op/s
Dec 01 09:37:01 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e55 e55: 3 total, 3 up, 3 in
Dec 01 09:37:01 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e55: 3 total, 3 up, 3 in
Dec 01 09:37:01 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v760: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 54 KiB/s rd, 5.3 KiB/s wr, 74 op/s
Dec 01 09:37:02 compute-0 ceph-mon[75031]: osdmap e55: 3 total, 3 up, 3 in
Dec 01 09:37:03 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e55 do_prune osdmap full prune enabled
Dec 01 09:37:03 compute-0 ceph-mon[75031]: pgmap v760: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 54 KiB/s rd, 5.3 KiB/s wr, 74 op/s
Dec 01 09:37:03 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e56 e56: 3 total, 3 up, 3 in
Dec 01 09:37:03 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e56: 3 total, 3 up, 3 in
Dec 01 09:37:03 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v762: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 55 KiB/s rd, 5.2 KiB/s wr, 74 op/s
Dec 01 09:37:04 compute-0 ceph-mon[75031]: osdmap e56: 3 total, 3 up, 3 in
Dec 01 09:37:04 compute-0 podman[255487]: 2025-12-01 09:37:04.980517334 +0000 UTC m=+0.079075827 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Dec 01 09:37:05 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e56 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:37:05 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e56 do_prune osdmap full prune enabled
Dec 01 09:37:05 compute-0 ceph-mon[75031]: pgmap v762: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 55 KiB/s rd, 5.2 KiB/s wr, 74 op/s
Dec 01 09:37:05 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e57 e57: 3 total, 3 up, 3 in
Dec 01 09:37:05 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e57: 3 total, 3 up, 3 in
Dec 01 09:37:05 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v764: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 125 KiB/s rd, 11 KiB/s wr, 170 op/s
Dec 01 09:37:06 compute-0 ceph-mon[75031]: osdmap e57: 3 total, 3 up, 3 in
Dec 01 09:37:06 compute-0 ceph-mon[75031]: pgmap v764: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 125 KiB/s rd, 11 KiB/s wr, 170 op/s
Dec 01 09:37:07 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e57 do_prune osdmap full prune enabled
Dec 01 09:37:07 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e58 e58: 3 total, 3 up, 3 in
Dec 01 09:37:07 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e58: 3 total, 3 up, 3 in
Dec 01 09:37:07 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v766: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 98 KiB/s rd, 11 KiB/s wr, 139 op/s
Dec 01 09:37:08 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e58 do_prune osdmap full prune enabled
Dec 01 09:37:08 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e59 e59: 3 total, 3 up, 3 in
Dec 01 09:37:08 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e59: 3 total, 3 up, 3 in
Dec 01 09:37:08 compute-0 ceph-mon[75031]: osdmap e58: 3 total, 3 up, 3 in
Dec 01 09:37:08 compute-0 ceph-mon[75031]: pgmap v766: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 98 KiB/s rd, 11 KiB/s wr, 139 op/s
Dec 01 09:37:09 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e59 do_prune osdmap full prune enabled
Dec 01 09:37:09 compute-0 ceph-mon[75031]: osdmap e59: 3 total, 3 up, 3 in
Dec 01 09:37:09 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e60 e60: 3 total, 3 up, 3 in
Dec 01 09:37:09 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e60: 3 total, 3 up, 3 in
Dec 01 09:37:09 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v769: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 37 KiB/s rd, 7.4 KiB/s wr, 57 op/s
Dec 01 09:37:10 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e60 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:37:10 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e60 do_prune osdmap full prune enabled
Dec 01 09:37:10 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e61 e61: 3 total, 3 up, 3 in
Dec 01 09:37:10 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e61: 3 total, 3 up, 3 in
Dec 01 09:37:10 compute-0 ceph-mon[75031]: osdmap e60: 3 total, 3 up, 3 in
Dec 01 09:37:10 compute-0 ceph-mon[75031]: pgmap v769: 193 pgs: 193 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 37 KiB/s rd, 7.4 KiB/s wr, 57 op/s
Dec 01 09:37:10 compute-0 ceph-mon[75031]: osdmap e61: 3 total, 3 up, 3 in
Dec 01 09:37:11 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e61 do_prune osdmap full prune enabled
Dec 01 09:37:11 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e62 e62: 3 total, 3 up, 3 in
Dec 01 09:37:11 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e62: 3 total, 3 up, 3 in
Dec 01 09:37:11 compute-0 sudo[255509]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:37:11 compute-0 sudo[255509]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:37:11 compute-0 sudo[255509]: pam_unix(sudo:session): session closed for user root
Dec 01 09:37:11 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v772: 193 pgs: 193 active+clean; 105 MiB data, 186 MiB used, 60 GiB / 60 GiB avail; 125 KiB/s rd, 16 MiB/s wr, 172 op/s
Dec 01 09:37:11 compute-0 sudo[255534]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:37:11 compute-0 sudo[255534]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:37:11 compute-0 sudo[255534]: pam_unix(sudo:session): session closed for user root
Dec 01 09:37:11 compute-0 sudo[255559]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:37:11 compute-0 sudo[255559]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:37:11 compute-0 sudo[255559]: pam_unix(sudo:session): session closed for user root
Dec 01 09:37:11 compute-0 sudo[255584]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ls
Dec 01 09:37:11 compute-0 sudo[255584]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:37:12 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e62 do_prune osdmap full prune enabled
Dec 01 09:37:12 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e63 e63: 3 total, 3 up, 3 in
Dec 01 09:37:12 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e63: 3 total, 3 up, 3 in
Dec 01 09:37:12 compute-0 ceph-mon[75031]: osdmap e62: 3 total, 3 up, 3 in
Dec 01 09:37:12 compute-0 podman[255681]: 2025-12-01 09:37:12.584662175 +0000 UTC m=+0.081588508 container exec a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True)
Dec 01 09:37:12 compute-0 podman[255681]: 2025-12-01 09:37:12.70991858 +0000 UTC m=+0.206844923 container exec_died a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:37:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:37:13
Dec 01 09:37:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:37:13 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:37:13 compute-0 ceph-mgr[75324]: [balancer INFO root] pools ['backups', 'cephfs.cephfs.meta', 'vms', 'images', 'volumes', '.mgr', 'cephfs.cephfs.data']
Dec 01 09:37:13 compute-0 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec 01 09:37:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:37:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:37:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:37:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:37:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:37:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:37:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:37:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:37:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:37:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:37:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:37:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:37:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:37:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:37:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:37:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:37:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e63 do_prune osdmap full prune enabled
Dec 01 09:37:13 compute-0 ceph-mon[75031]: pgmap v772: 193 pgs: 193 active+clean; 105 MiB data, 186 MiB used, 60 GiB / 60 GiB avail; 125 KiB/s rd, 16 MiB/s wr, 172 op/s
Dec 01 09:37:13 compute-0 ceph-mon[75031]: osdmap e63: 3 total, 3 up, 3 in
Dec 01 09:37:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e64 e64: 3 total, 3 up, 3 in
Dec 01 09:37:13 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e64: 3 total, 3 up, 3 in
Dec 01 09:37:13 compute-0 sudo[255584]: pam_unix(sudo:session): session closed for user root
Dec 01 09:37:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:37:13 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:37:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:37:13 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:37:13 compute-0 sudo[255821]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:37:13 compute-0 sudo[255821]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:37:13 compute-0 sudo[255821]: pam_unix(sudo:session): session closed for user root
Dec 01 09:37:13 compute-0 sudo[255846]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:37:13 compute-0 sudo[255846]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:37:13 compute-0 sudo[255846]: pam_unix(sudo:session): session closed for user root
Dec 01 09:37:13 compute-0 sudo[255871]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:37:13 compute-0 sudo[255871]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:37:13 compute-0 sudo[255871]: pam_unix(sudo:session): session closed for user root
Dec 01 09:37:13 compute-0 sudo[255896]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Dec 01 09:37:13 compute-0 sudo[255896]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:37:13 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v775: 193 pgs: 193 active+clean; 105 MiB data, 186 MiB used, 60 GiB / 60 GiB avail; 234 KiB/s rd, 28 MiB/s wr, 333 op/s
Dec 01 09:37:14 compute-0 sudo[255896]: pam_unix(sudo:session): session closed for user root
Dec 01 09:37:14 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:37:14 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:37:14 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec 01 09:37:14 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:37:14 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec 01 09:37:14 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:37:14 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 7fd36b44-d0b6-42f0-a75d-245c120b3058 does not exist
Dec 01 09:37:14 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 94a9b564-905c-46a4-9bda-44359844445a does not exist
Dec 01 09:37:14 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 8c8a5006-de33-43f5-baaa-981aa2e9e3da does not exist
Dec 01 09:37:14 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec 01 09:37:14 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:37:14 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec 01 09:37:14 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:37:14 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:37:14 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:37:14 compute-0 sudo[255951]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:37:14 compute-0 sudo[255951]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:37:14 compute-0 sudo[255951]: pam_unix(sudo:session): session closed for user root
Dec 01 09:37:14 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e64 do_prune osdmap full prune enabled
Dec 01 09:37:14 compute-0 ceph-mon[75031]: osdmap e64: 3 total, 3 up, 3 in
Dec 01 09:37:14 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:37:14 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:37:14 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:37:14 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:37:14 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:37:14 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:37:14 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:37:14 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:37:14 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e65 e65: 3 total, 3 up, 3 in
Dec 01 09:37:14 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e65: 3 total, 3 up, 3 in
Dec 01 09:37:14 compute-0 sudo[255976]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:37:14 compute-0 sudo[255976]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:37:14 compute-0 sudo[255976]: pam_unix(sudo:session): session closed for user root
Dec 01 09:37:14 compute-0 sudo[256001]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:37:14 compute-0 sudo[256001]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:37:14 compute-0 sudo[256001]: pam_unix(sudo:session): session closed for user root
Dec 01 09:37:14 compute-0 sudo[256026]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --yes --no-systemd
Dec 01 09:37:14 compute-0 sudo[256026]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:37:14 compute-0 podman[256091]: 2025-12-01 09:37:14.773920482 +0000 UTC m=+0.054583761 container create def1b72fbcdccdd2f4c5812858da626e956def05b2808d8ed46e3713fc7b862e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_rubin, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.license=GPLv2)
Dec 01 09:37:14 compute-0 systemd[1]: Started libpod-conmon-def1b72fbcdccdd2f4c5812858da626e956def05b2808d8ed46e3713fc7b862e.scope.
Dec 01 09:37:14 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:37:14 compute-0 podman[256091]: 2025-12-01 09:37:14.745532566 +0000 UTC m=+0.026195925 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:37:14 compute-0 podman[256091]: 2025-12-01 09:37:14.85202234 +0000 UTC m=+0.132685629 container init def1b72fbcdccdd2f4c5812858da626e956def05b2808d8ed46e3713fc7b862e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_rubin, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=reef)
Dec 01 09:37:14 compute-0 podman[256091]: 2025-12-01 09:37:14.859542226 +0000 UTC m=+0.140205495 container start def1b72fbcdccdd2f4c5812858da626e956def05b2808d8ed46e3713fc7b862e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_rubin, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:37:14 compute-0 crazy_rubin[256107]: 167 167
Dec 01 09:37:14 compute-0 podman[256091]: 2025-12-01 09:37:14.864238141 +0000 UTC m=+0.144901410 container attach def1b72fbcdccdd2f4c5812858da626e956def05b2808d8ed46e3713fc7b862e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_rubin, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0)
Dec 01 09:37:14 compute-0 systemd[1]: libpod-def1b72fbcdccdd2f4c5812858da626e956def05b2808d8ed46e3713fc7b862e.scope: Deactivated successfully.
Dec 01 09:37:14 compute-0 podman[256091]: 2025-12-01 09:37:14.866105825 +0000 UTC m=+0.146769094 container died def1b72fbcdccdd2f4c5812858da626e956def05b2808d8ed46e3713fc7b862e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_rubin, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:37:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-5fbaab6f1aab82115def2925b4fa2bfe38057812eedd4c773d4c7eb957c71fd4-merged.mount: Deactivated successfully.
Dec 01 09:37:14 compute-0 podman[256091]: 2025-12-01 09:37:14.905237111 +0000 UTC m=+0.185900370 container remove def1b72fbcdccdd2f4c5812858da626e956def05b2808d8ed46e3713fc7b862e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_rubin, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec 01 09:37:14 compute-0 systemd[1]: libpod-conmon-def1b72fbcdccdd2f4c5812858da626e956def05b2808d8ed46e3713fc7b862e.scope: Deactivated successfully.
Dec 01 09:37:15 compute-0 podman[256129]: 2025-12-01 09:37:15.087608759 +0000 UTC m=+0.039871228 container create ef95e3a4933dbe3f413b40f1a990d05dc6062f59114e8930bc77819522aaf54a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_gauss, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507)
Dec 01 09:37:15 compute-0 systemd[1]: Started libpod-conmon-ef95e3a4933dbe3f413b40f1a990d05dc6062f59114e8930bc77819522aaf54a.scope.
Dec 01 09:37:15 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:37:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/780fc2eadec1ee4711f5812ca6edb98d4090c3f4388eb318fcbf398af6518aa6/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:37:15 compute-0 podman[256129]: 2025-12-01 09:37:15.067735857 +0000 UTC m=+0.019998246 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:37:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/780fc2eadec1ee4711f5812ca6edb98d4090c3f4388eb318fcbf398af6518aa6/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:37:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/780fc2eadec1ee4711f5812ca6edb98d4090c3f4388eb318fcbf398af6518aa6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:37:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/780fc2eadec1ee4711f5812ca6edb98d4090c3f4388eb318fcbf398af6518aa6/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:37:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/780fc2eadec1ee4711f5812ca6edb98d4090c3f4388eb318fcbf398af6518aa6/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:37:15 compute-0 podman[256129]: 2025-12-01 09:37:15.179419231 +0000 UTC m=+0.131681670 container init ef95e3a4933dbe3f413b40f1a990d05dc6062f59114e8930bc77819522aaf54a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_gauss, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:37:15 compute-0 podman[256129]: 2025-12-01 09:37:15.192610751 +0000 UTC m=+0.144873160 container start ef95e3a4933dbe3f413b40f1a990d05dc6062f59114e8930bc77819522aaf54a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_gauss, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Dec 01 09:37:15 compute-0 podman[256129]: 2025-12-01 09:37:15.196327977 +0000 UTC m=+0.148590416 container attach ef95e3a4933dbe3f413b40f1a990d05dc6062f59114e8930bc77819522aaf54a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_gauss, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 01 09:37:15 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e65 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:37:15 compute-0 ceph-mon[75031]: pgmap v775: 193 pgs: 193 active+clean; 105 MiB data, 186 MiB used, 60 GiB / 60 GiB avail; 234 KiB/s rd, 28 MiB/s wr, 333 op/s
Dec 01 09:37:15 compute-0 ceph-mon[75031]: osdmap e65: 3 total, 3 up, 3 in
Dec 01 09:37:15 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e65 do_prune osdmap full prune enabled
Dec 01 09:37:15 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e66 e66: 3 total, 3 up, 3 in
Dec 01 09:37:15 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e66: 3 total, 3 up, 3 in
Dec 01 09:37:15 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v778: 193 pgs: 193 active+clean; 41 MiB data, 126 MiB used, 60 GiB / 60 GiB avail; 213 KiB/s rd, 12 MiB/s wr, 305 op/s
Dec 01 09:37:16 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e66 do_prune osdmap full prune enabled
Dec 01 09:37:16 compute-0 ceph-mon[75031]: osdmap e66: 3 total, 3 up, 3 in
Dec 01 09:37:16 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e67 e67: 3 total, 3 up, 3 in
Dec 01 09:37:16 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e67: 3 total, 3 up, 3 in
Dec 01 09:37:16 compute-0 naughty_gauss[256145]: --> passed data devices: 0 physical, 3 LVM
Dec 01 09:37:16 compute-0 naughty_gauss[256145]: --> relative data size: 1.0
Dec 01 09:37:16 compute-0 naughty_gauss[256145]: --> All data devices are unavailable
Dec 01 09:37:16 compute-0 systemd[1]: libpod-ef95e3a4933dbe3f413b40f1a990d05dc6062f59114e8930bc77819522aaf54a.scope: Deactivated successfully.
Dec 01 09:37:16 compute-0 systemd[1]: libpod-ef95e3a4933dbe3f413b40f1a990d05dc6062f59114e8930bc77819522aaf54a.scope: Consumed 1.177s CPU time.
Dec 01 09:37:16 compute-0 podman[256129]: 2025-12-01 09:37:16.436760681 +0000 UTC m=+1.389023090 container died ef95e3a4933dbe3f413b40f1a990d05dc6062f59114e8930bc77819522aaf54a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_gauss, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.build-date=20250507)
Dec 01 09:37:16 compute-0 systemd[1]: var-lib-containers-storage-overlay-780fc2eadec1ee4711f5812ca6edb98d4090c3f4388eb318fcbf398af6518aa6-merged.mount: Deactivated successfully.
Dec 01 09:37:16 compute-0 podman[256129]: 2025-12-01 09:37:16.504775588 +0000 UTC m=+1.457037967 container remove ef95e3a4933dbe3f413b40f1a990d05dc6062f59114e8930bc77819522aaf54a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_gauss, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Dec 01 09:37:16 compute-0 systemd[1]: libpod-conmon-ef95e3a4933dbe3f413b40f1a990d05dc6062f59114e8930bc77819522aaf54a.scope: Deactivated successfully.
Dec 01 09:37:16 compute-0 sudo[256026]: pam_unix(sudo:session): session closed for user root
Dec 01 09:37:16 compute-0 sudo[256184]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:37:16 compute-0 sudo[256184]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:37:16 compute-0 sudo[256184]: pam_unix(sudo:session): session closed for user root
Dec 01 09:37:16 compute-0 sudo[256209]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:37:16 compute-0 sudo[256209]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:37:16 compute-0 sudo[256209]: pam_unix(sudo:session): session closed for user root
Dec 01 09:37:16 compute-0 sudo[256234]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:37:16 compute-0 sudo[256234]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:37:16 compute-0 sudo[256234]: pam_unix(sudo:session): session closed for user root
Dec 01 09:37:16 compute-0 sudo[256259]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- lvm list --format json
Dec 01 09:37:16 compute-0 sudo[256259]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:37:17 compute-0 podman[256324]: 2025-12-01 09:37:17.192134007 +0000 UTC m=+0.069257704 container create a4e89caa0f5ed97d2eabf9c72ecdc6dc0814ba161b1b6cd4a5957b183a8f15ab (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigilant_volhard, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:37:17 compute-0 systemd[1]: Started libpod-conmon-a4e89caa0f5ed97d2eabf9c72ecdc6dc0814ba161b1b6cd4a5957b183a8f15ab.scope.
Dec 01 09:37:17 compute-0 podman[256324]: 2025-12-01 09:37:17.158410637 +0000 UTC m=+0.035534364 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:37:17 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:37:17 compute-0 podman[256324]: 2025-12-01 09:37:17.27947908 +0000 UTC m=+0.156602817 container init a4e89caa0f5ed97d2eabf9c72ecdc6dc0814ba161b1b6cd4a5957b183a8f15ab (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigilant_volhard, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_REF=reef)
Dec 01 09:37:17 compute-0 podman[256324]: 2025-12-01 09:37:17.285183555 +0000 UTC m=+0.162307272 container start a4e89caa0f5ed97d2eabf9c72ecdc6dc0814ba161b1b6cd4a5957b183a8f15ab (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigilant_volhard, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:37:17 compute-0 vigilant_volhard[256340]: 167 167
Dec 01 09:37:17 compute-0 systemd[1]: libpod-a4e89caa0f5ed97d2eabf9c72ecdc6dc0814ba161b1b6cd4a5957b183a8f15ab.scope: Deactivated successfully.
Dec 01 09:37:17 compute-0 podman[256324]: 2025-12-01 09:37:17.28953963 +0000 UTC m=+0.166663347 container attach a4e89caa0f5ed97d2eabf9c72ecdc6dc0814ba161b1b6cd4a5957b183a8f15ab (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigilant_volhard, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:37:17 compute-0 podman[256324]: 2025-12-01 09:37:17.289824648 +0000 UTC m=+0.166948355 container died a4e89caa0f5ed97d2eabf9c72ecdc6dc0814ba161b1b6cd4a5957b183a8f15ab (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigilant_volhard, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:37:17 compute-0 systemd[1]: var-lib-containers-storage-overlay-5a932a4de8bea6e555a3ac17f20413f624c6c021ab30ad9f6aeaf8daf3d5d9d7-merged.mount: Deactivated successfully.
Dec 01 09:37:17 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e67 do_prune osdmap full prune enabled
Dec 01 09:37:17 compute-0 podman[256324]: 2025-12-01 09:37:17.330791507 +0000 UTC m=+0.207915204 container remove a4e89caa0f5ed97d2eabf9c72ecdc6dc0814ba161b1b6cd4a5957b183a8f15ab (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigilant_volhard, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Dec 01 09:37:17 compute-0 ceph-mon[75031]: pgmap v778: 193 pgs: 193 active+clean; 41 MiB data, 126 MiB used, 60 GiB / 60 GiB avail; 213 KiB/s rd, 12 MiB/s wr, 305 op/s
Dec 01 09:37:17 compute-0 ceph-mon[75031]: osdmap e67: 3 total, 3 up, 3 in
Dec 01 09:37:17 compute-0 systemd[1]: libpod-conmon-a4e89caa0f5ed97d2eabf9c72ecdc6dc0814ba161b1b6cd4a5957b183a8f15ab.scope: Deactivated successfully.
Dec 01 09:37:17 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e68 e68: 3 total, 3 up, 3 in
Dec 01 09:37:17 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e68: 3 total, 3 up, 3 in
Dec 01 09:37:17 compute-0 podman[256365]: 2025-12-01 09:37:17.519072495 +0000 UTC m=+0.023221719 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:37:17 compute-0 podman[256365]: 2025-12-01 09:37:17.622151631 +0000 UTC m=+0.126300865 container create b542f996e2cef15848c638d0c5cac34b17138568a872fc6ff447b2a7851e59c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=romantic_taussig, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:37:17 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v781: 193 pgs: 193 active+clean; 41 MiB data, 130 MiB used, 60 GiB / 60 GiB avail; 195 KiB/s rd, 17 KiB/s wr, 268 op/s
Dec 01 09:37:17 compute-0 systemd[1]: Started libpod-conmon-b542f996e2cef15848c638d0c5cac34b17138568a872fc6ff447b2a7851e59c8.scope.
Dec 01 09:37:17 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:37:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ae3a8a21a23d8773fcfc890b07dc550d22ab70c32a7f231b1353f336fdc1f74b/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:37:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ae3a8a21a23d8773fcfc890b07dc550d22ab70c32a7f231b1353f336fdc1f74b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:37:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ae3a8a21a23d8773fcfc890b07dc550d22ab70c32a7f231b1353f336fdc1f74b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:37:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ae3a8a21a23d8773fcfc890b07dc550d22ab70c32a7f231b1353f336fdc1f74b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:37:17 compute-0 podman[256365]: 2025-12-01 09:37:17.734976048 +0000 UTC m=+0.239125342 container init b542f996e2cef15848c638d0c5cac34b17138568a872fc6ff447b2a7851e59c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=romantic_taussig, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:37:17 compute-0 podman[256365]: 2025-12-01 09:37:17.746231652 +0000 UTC m=+0.250380856 container start b542f996e2cef15848c638d0c5cac34b17138568a872fc6ff447b2a7851e59c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=romantic_taussig, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Dec 01 09:37:17 compute-0 podman[256365]: 2025-12-01 09:37:17.771913101 +0000 UTC m=+0.276062345 container attach b542f996e2cef15848c638d0c5cac34b17138568a872fc6ff447b2a7851e59c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=romantic_taussig, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef)
Dec 01 09:37:18 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e68 do_prune osdmap full prune enabled
Dec 01 09:37:18 compute-0 ceph-mon[75031]: osdmap e68: 3 total, 3 up, 3 in
Dec 01 09:37:18 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e69 e69: 3 total, 3 up, 3 in
Dec 01 09:37:18 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e69: 3 total, 3 up, 3 in
Dec 01 09:37:18 compute-0 romantic_taussig[256381]: {
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:     "0": [
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:         {
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             "devices": [
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "/dev/loop3"
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             ],
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             "lv_name": "ceph_lv0",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             "lv_size": "21470642176",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             "name": "ceph_lv0",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             "tags": {
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.cluster_name": "ceph",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.crush_device_class": "",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.encrypted": "0",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.osd_id": "0",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.type": "block",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.vdo": "0"
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             },
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             "type": "block",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             "vg_name": "ceph_vg0"
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:         }
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:     ],
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:     "1": [
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:         {
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             "devices": [
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "/dev/loop4"
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             ],
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             "lv_name": "ceph_lv1",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             "lv_size": "21470642176",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             "name": "ceph_lv1",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             "tags": {
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.cluster_name": "ceph",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.crush_device_class": "",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.encrypted": "0",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.osd_id": "1",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.type": "block",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.vdo": "0"
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             },
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             "type": "block",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             "vg_name": "ceph_vg1"
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:         }
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:     ],
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:     "2": [
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:         {
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             "devices": [
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "/dev/loop5"
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             ],
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             "lv_name": "ceph_lv2",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             "lv_size": "21470642176",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             "name": "ceph_lv2",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             "tags": {
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.cluster_name": "ceph",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.crush_device_class": "",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.encrypted": "0",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.osd_id": "2",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.type": "block",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:                 "ceph.vdo": "0"
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             },
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             "type": "block",
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:             "vg_name": "ceph_vg2"
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:         }
Dec 01 09:37:18 compute-0 romantic_taussig[256381]:     ]
Dec 01 09:37:18 compute-0 romantic_taussig[256381]: }
Dec 01 09:37:18 compute-0 nova_compute[250706]: 2025-12-01 09:37:18.559 250710 DEBUG oslo_concurrency.lockutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquiring lock "6740b382-574d-4ced-a156-11a531b94114" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:37:18 compute-0 nova_compute[250706]: 2025-12-01 09:37:18.561 250710 DEBUG oslo_concurrency.lockutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "6740b382-574d-4ced-a156-11a531b94114" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:37:18 compute-0 nova_compute[250706]: 2025-12-01 09:37:18.581 250710 DEBUG nova.compute.manager [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 01 09:37:18 compute-0 systemd[1]: libpod-b542f996e2cef15848c638d0c5cac34b17138568a872fc6ff447b2a7851e59c8.scope: Deactivated successfully.
Dec 01 09:37:18 compute-0 podman[256365]: 2025-12-01 09:37:18.588282192 +0000 UTC m=+1.092431396 container died b542f996e2cef15848c638d0c5cac34b17138568a872fc6ff447b2a7851e59c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=romantic_taussig, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Dec 01 09:37:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:37:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:37:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:37:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:37:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:37:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:37:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 1.9077212346161359e-07 of space, bias 1.0, pg target 5.723163703848408e-05 quantized to 32 (current 32)
Dec 01 09:37:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:37:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:37:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:37:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0006672572971609038 of space, bias 1.0, pg target 0.20017718914827115 quantized to 32 (current 32)
Dec 01 09:37:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:37:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec 01 09:37:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:37:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:37:18 compute-0 systemd[1]: var-lib-containers-storage-overlay-ae3a8a21a23d8773fcfc890b07dc550d22ab70c32a7f231b1353f336fdc1f74b-merged.mount: Deactivated successfully.
Dec 01 09:37:18 compute-0 podman[256365]: 2025-12-01 09:37:18.647261339 +0000 UTC m=+1.151410543 container remove b542f996e2cef15848c638d0c5cac34b17138568a872fc6ff447b2a7851e59c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=romantic_taussig, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:37:18 compute-0 systemd[1]: libpod-conmon-b542f996e2cef15848c638d0c5cac34b17138568a872fc6ff447b2a7851e59c8.scope: Deactivated successfully.
Dec 01 09:37:18 compute-0 sudo[256259]: pam_unix(sudo:session): session closed for user root
Dec 01 09:37:18 compute-0 nova_compute[250706]: 2025-12-01 09:37:18.689 250710 DEBUG oslo_concurrency.lockutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:37:18 compute-0 nova_compute[250706]: 2025-12-01 09:37:18.690 250710 DEBUG oslo_concurrency.lockutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:37:18 compute-0 nova_compute[250706]: 2025-12-01 09:37:18.698 250710 DEBUG nova.virt.hardware [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 01 09:37:18 compute-0 nova_compute[250706]: 2025-12-01 09:37:18.698 250710 INFO nova.compute.claims [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Claim successful on node compute-0.ctlplane.example.com
Dec 01 09:37:18 compute-0 sudo[256402]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:37:18 compute-0 sudo[256402]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:37:18 compute-0 sudo[256402]: pam_unix(sudo:session): session closed for user root
Dec 01 09:37:18 compute-0 nova_compute[250706]: 2025-12-01 09:37:18.815 250710 DEBUG oslo_concurrency.processutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:37:18 compute-0 sudo[256427]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:37:18 compute-0 sudo[256427]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:37:18 compute-0 sudo[256427]: pam_unix(sudo:session): session closed for user root
Dec 01 09:37:18 compute-0 sudo[256453]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:37:18 compute-0 sudo[256453]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:37:18 compute-0 sudo[256453]: pam_unix(sudo:session): session closed for user root
Dec 01 09:37:18 compute-0 sudo[256478]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- raw list --format json
Dec 01 09:37:18 compute-0 sudo[256478]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:37:19 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:37:19 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/149037888' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:37:19 compute-0 nova_compute[250706]: 2025-12-01 09:37:19.252 250710 DEBUG oslo_concurrency.processutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:37:19 compute-0 nova_compute[250706]: 2025-12-01 09:37:19.259 250710 DEBUG nova.compute.provider_tree [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Inventory has not changed in ProviderTree for provider: 847e3dbe-0f76-4032-a374-8c965945c22f update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 09:37:19 compute-0 nova_compute[250706]: 2025-12-01 09:37:19.292 250710 DEBUG nova.scheduler.client.report [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Inventory has not changed for provider 847e3dbe-0f76-4032-a374-8c965945c22f based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 09:37:19 compute-0 nova_compute[250706]: 2025-12-01 09:37:19.318 250710 DEBUG oslo_concurrency.lockutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.628s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:37:19 compute-0 nova_compute[250706]: 2025-12-01 09:37:19.319 250710 DEBUG nova.compute.manager [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 01 09:37:19 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e69 do_prune osdmap full prune enabled
Dec 01 09:37:19 compute-0 ceph-mon[75031]: pgmap v781: 193 pgs: 193 active+clean; 41 MiB data, 130 MiB used, 60 GiB / 60 GiB avail; 195 KiB/s rd, 17 KiB/s wr, 268 op/s
Dec 01 09:37:19 compute-0 ceph-mon[75031]: osdmap e69: 3 total, 3 up, 3 in
Dec 01 09:37:19 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/149037888' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:37:19 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e70 e70: 3 total, 3 up, 3 in
Dec 01 09:37:19 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e70: 3 total, 3 up, 3 in
Dec 01 09:37:19 compute-0 nova_compute[250706]: 2025-12-01 09:37:19.368 250710 DEBUG nova.compute.manager [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 01 09:37:19 compute-0 nova_compute[250706]: 2025-12-01 09:37:19.369 250710 DEBUG nova.network.neutron [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 01 09:37:19 compute-0 podman[256561]: 2025-12-01 09:37:19.36945535 +0000 UTC m=+0.063599811 container create 289c1432a17d9c7b909d3ed355bbcbba67882c700f4c4a8b6b2ae0926e4dc047 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_goldstine, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef)
Dec 01 09:37:19 compute-0 nova_compute[250706]: 2025-12-01 09:37:19.397 250710 INFO nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 01 09:37:19 compute-0 nova_compute[250706]: 2025-12-01 09:37:19.423 250710 DEBUG nova.compute.manager [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 01 09:37:19 compute-0 podman[256561]: 2025-12-01 09:37:19.339854018 +0000 UTC m=+0.033998509 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:37:19 compute-0 systemd[1]: Started libpod-conmon-289c1432a17d9c7b909d3ed355bbcbba67882c700f4c4a8b6b2ae0926e4dc047.scope.
Dec 01 09:37:19 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:37:19 compute-0 nova_compute[250706]: 2025-12-01 09:37:19.477 250710 INFO nova.virt.block_device [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Booting with volume 61c9bc29-5cbf-4816-a0ae-b24ddf88776c at /dev/vda
Dec 01 09:37:19 compute-0 podman[256561]: 2025-12-01 09:37:19.48484022 +0000 UTC m=+0.178984661 container init 289c1432a17d9c7b909d3ed355bbcbba67882c700f4c4a8b6b2ae0926e4dc047 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_goldstine, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, ceph=True)
Dec 01 09:37:19 compute-0 podman[256561]: 2025-12-01 09:37:19.496324741 +0000 UTC m=+0.190469162 container start 289c1432a17d9c7b909d3ed355bbcbba67882c700f4c4a8b6b2ae0926e4dc047 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_goldstine, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:37:19 compute-0 podman[256561]: 2025-12-01 09:37:19.49875199 +0000 UTC m=+0.192896431 container attach 289c1432a17d9c7b909d3ed355bbcbba67882c700f4c4a8b6b2ae0926e4dc047 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_goldstine, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, ceph=True, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 01 09:37:19 compute-0 musing_goldstine[256577]: 167 167
Dec 01 09:37:19 compute-0 systemd[1]: libpod-289c1432a17d9c7b909d3ed355bbcbba67882c700f4c4a8b6b2ae0926e4dc047.scope: Deactivated successfully.
Dec 01 09:37:19 compute-0 podman[256561]: 2025-12-01 09:37:19.502967412 +0000 UTC m=+0.197111833 container died 289c1432a17d9c7b909d3ed355bbcbba67882c700f4c4a8b6b2ae0926e4dc047 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_goldstine, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Dec 01 09:37:19 compute-0 systemd[1]: var-lib-containers-storage-overlay-a1259d33cc225eafb375638b21ca8e7ecf04d38db0b9d7d11879af4abc34a161-merged.mount: Deactivated successfully.
Dec 01 09:37:19 compute-0 podman[256561]: 2025-12-01 09:37:19.5432302 +0000 UTC m=+0.237374621 container remove 289c1432a17d9c7b909d3ed355bbcbba67882c700f4c4a8b6b2ae0926e4dc047 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_goldstine, ceph=True, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:37:19 compute-0 systemd[1]: libpod-conmon-289c1432a17d9c7b909d3ed355bbcbba67882c700f4c4a8b6b2ae0926e4dc047.scope: Deactivated successfully.
Dec 01 09:37:19 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v784: 193 pgs: 193 active+clean; 41 MiB data, 130 MiB used, 60 GiB / 60 GiB avail; 91 KiB/s rd, 7.5 KiB/s wr, 123 op/s
Dec 01 09:37:19 compute-0 podman[256601]: 2025-12-01 09:37:19.737533732 +0000 UTC m=+0.062103119 container create d2b8a7b3ac5030eedc1225d1ea69d9f767277a7af8cd9933888c91084905f130 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_zhukovsky, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250507, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:37:19 compute-0 systemd[1]: Started libpod-conmon-d2b8a7b3ac5030eedc1225d1ea69d9f767277a7af8cd9933888c91084905f130.scope.
Dec 01 09:37:19 compute-0 podman[256601]: 2025-12-01 09:37:19.712155741 +0000 UTC m=+0.036725218 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:37:19 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:37:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d4bcb6f30d6ac7e44735a9bf19ceaaff7b0ff9b47c1e6362b7fbed9cdd91cb89/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:37:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d4bcb6f30d6ac7e44735a9bf19ceaaff7b0ff9b47c1e6362b7fbed9cdd91cb89/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:37:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d4bcb6f30d6ac7e44735a9bf19ceaaff7b0ff9b47c1e6362b7fbed9cdd91cb89/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:37:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d4bcb6f30d6ac7e44735a9bf19ceaaff7b0ff9b47c1e6362b7fbed9cdd91cb89/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:37:19 compute-0 podman[256601]: 2025-12-01 09:37:19.842993256 +0000 UTC m=+0.167562663 container init d2b8a7b3ac5030eedc1225d1ea69d9f767277a7af8cd9933888c91084905f130 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_zhukovsky, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507)
Dec 01 09:37:19 compute-0 podman[256601]: 2025-12-01 09:37:19.854914409 +0000 UTC m=+0.179483796 container start d2b8a7b3ac5030eedc1225d1ea69d9f767277a7af8cd9933888c91084905f130 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_zhukovsky, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2)
Dec 01 09:37:19 compute-0 podman[256601]: 2025-12-01 09:37:19.858499432 +0000 UTC m=+0.183068819 container attach d2b8a7b3ac5030eedc1225d1ea69d9f767277a7af8cd9933888c91084905f130 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_zhukovsky, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:37:19 compute-0 nova_compute[250706]: 2025-12-01 09:37:19.978 250710 DEBUG os_brick.utils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.100', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-0.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176
Dec 01 09:37:19 compute-0 nova_compute[250706]: 2025-12-01 09:37:19.984 250710 INFO oslo.privsep.daemon [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'os_brick.privileged.default', '--privsep_sock_path', '/tmp/tmp9do_h71r/privsep.sock']
Dec 01 09:37:20 compute-0 nova_compute[250706]: 2025-12-01 09:37:20.126 250710 DEBUG nova.network.neutron [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Dec 01 09:37:20 compute-0 nova_compute[250706]: 2025-12-01 09:37:20.127 250710 DEBUG nova.compute.manager [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 01 09:37:20 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e70 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:37:20 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e70 do_prune osdmap full prune enabled
Dec 01 09:37:20 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e71 e71: 3 total, 3 up, 3 in
Dec 01 09:37:20 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e71: 3 total, 3 up, 3 in
Dec 01 09:37:20 compute-0 ceph-mon[75031]: osdmap e70: 3 total, 3 up, 3 in
Dec 01 09:37:20 compute-0 ceph-mon[75031]: osdmap e71: 3 total, 3 up, 3 in
Dec 01 09:37:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:37:20.474 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:37:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:37:20.475 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:37:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:37:20.475 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:37:20 compute-0 nova_compute[250706]: 2025-12-01 09:37:20.829 250710 INFO oslo.privsep.daemon [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Spawned new privsep daemon via rootwrap
Dec 01 09:37:20 compute-0 nova_compute[250706]: 2025-12-01 09:37:20.692 256632 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 01 09:37:20 compute-0 nova_compute[250706]: 2025-12-01 09:37:20.696 256632 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 01 09:37:20 compute-0 nova_compute[250706]: 2025-12-01 09:37:20.698 256632 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Dec 01 09:37:20 compute-0 nova_compute[250706]: 2025-12-01 09:37:20.698 256632 INFO oslo.privsep.daemon [-] privsep daemon running as pid 256632
Dec 01 09:37:20 compute-0 nova_compute[250706]: 2025-12-01 09:37:20.833 256632 DEBUG oslo.privsep.daemon [-] privsep: reply[db9da8ff-f118-4412-a8a0-fa71303549fc]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 01 09:37:20 compute-0 quizzical_zhukovsky[256618]: {
Dec 01 09:37:20 compute-0 quizzical_zhukovsky[256618]:     "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec 01 09:37:20 compute-0 quizzical_zhukovsky[256618]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:37:20 compute-0 quizzical_zhukovsky[256618]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 01 09:37:20 compute-0 quizzical_zhukovsky[256618]:         "osd_id": 0,
Dec 01 09:37:20 compute-0 quizzical_zhukovsky[256618]:         "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:37:20 compute-0 quizzical_zhukovsky[256618]:         "type": "bluestore"
Dec 01 09:37:20 compute-0 quizzical_zhukovsky[256618]:     },
Dec 01 09:37:20 compute-0 quizzical_zhukovsky[256618]:     "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec 01 09:37:20 compute-0 quizzical_zhukovsky[256618]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:37:20 compute-0 quizzical_zhukovsky[256618]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 01 09:37:20 compute-0 quizzical_zhukovsky[256618]:         "osd_id": 1,
Dec 01 09:37:20 compute-0 quizzical_zhukovsky[256618]:         "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:37:20 compute-0 quizzical_zhukovsky[256618]:         "type": "bluestore"
Dec 01 09:37:20 compute-0 quizzical_zhukovsky[256618]:     },
Dec 01 09:37:20 compute-0 quizzical_zhukovsky[256618]:     "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec 01 09:37:20 compute-0 quizzical_zhukovsky[256618]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:37:20 compute-0 quizzical_zhukovsky[256618]:         "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec 01 09:37:20 compute-0 quizzical_zhukovsky[256618]:         "osd_id": 2,
Dec 01 09:37:20 compute-0 quizzical_zhukovsky[256618]:         "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:37:20 compute-0 quizzical_zhukovsky[256618]:         "type": "bluestore"
Dec 01 09:37:20 compute-0 quizzical_zhukovsky[256618]:     }
Dec 01 09:37:20 compute-0 quizzical_zhukovsky[256618]: }
Dec 01 09:37:20 compute-0 nova_compute[250706]: 2025-12-01 09:37:20.931 256632 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:37:20 compute-0 nova_compute[250706]: 2025-12-01 09:37:20.945 256632 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:37:20 compute-0 nova_compute[250706]: 2025-12-01 09:37:20.946 256632 DEBUG oslo.privsep.daemon [-] privsep: reply[b2d93fb9-a643-4c8c-b079-d2447c5695d2]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 01 09:37:20 compute-0 nova_compute[250706]: 2025-12-01 09:37:20.947 256632 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:37:20 compute-0 systemd[1]: libpod-d2b8a7b3ac5030eedc1225d1ea69d9f767277a7af8cd9933888c91084905f130.scope: Deactivated successfully.
Dec 01 09:37:20 compute-0 podman[256601]: 2025-12-01 09:37:20.948553219 +0000 UTC m=+1.273122616 container died d2b8a7b3ac5030eedc1225d1ea69d9f767277a7af8cd9933888c91084905f130 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_zhukovsky, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:37:20 compute-0 systemd[1]: libpod-d2b8a7b3ac5030eedc1225d1ea69d9f767277a7af8cd9933888c91084905f130.scope: Consumed 1.087s CPU time.
Dec 01 09:37:20 compute-0 nova_compute[250706]: 2025-12-01 09:37:20.957 256632 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:37:20 compute-0 nova_compute[250706]: 2025-12-01 09:37:20.959 256632 DEBUG oslo.privsep.daemon [-] privsep: reply[02b9dbb7-b490-441a-8bc2-1d8b92c8aaad]: (4, ('InitiatorName=iqn.1994-05.com.redhat:44dd6092e7fe', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 01 09:37:20 compute-0 nova_compute[250706]: 2025-12-01 09:37:20.961 256632 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:37:20 compute-0 systemd[1]: var-lib-containers-storage-overlay-d4bcb6f30d6ac7e44735a9bf19ceaaff7b0ff9b47c1e6362b7fbed9cdd91cb89-merged.mount: Deactivated successfully.
Dec 01 09:37:20 compute-0 nova_compute[250706]: 2025-12-01 09:37:20.980 256632 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.019s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:37:20 compute-0 nova_compute[250706]: 2025-12-01 09:37:20.980 256632 DEBUG oslo.privsep.daemon [-] privsep: reply[a3d225cb-38c3-4f3c-97f5-f7b22ef1f71d]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 01 09:37:20 compute-0 nova_compute[250706]: 2025-12-01 09:37:20.983 256632 DEBUG oslo.privsep.daemon [-] privsep: reply[bffc43dc-ac5b-47dd-8247-e84f0c87e14b]: (4, '52310927-1d30-4bda-9d2b-fd9f7cfadc4d') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 01 09:37:20 compute-0 nova_compute[250706]: 2025-12-01 09:37:20.983 250710 DEBUG oslo_concurrency.processutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:37:21 compute-0 podman[256601]: 2025-12-01 09:37:21.001962956 +0000 UTC m=+1.326532363 container remove d2b8a7b3ac5030eedc1225d1ea69d9f767277a7af8cd9933888c91084905f130 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_zhukovsky, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:37:21 compute-0 nova_compute[250706]: 2025-12-01 09:37:21.007 250710 DEBUG oslo_concurrency.processutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CMD "nvme version" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:37:21 compute-0 systemd[1]: libpod-conmon-d2b8a7b3ac5030eedc1225d1ea69d9f767277a7af8cd9933888c91084905f130.scope: Deactivated successfully.
Dec 01 09:37:21 compute-0 nova_compute[250706]: 2025-12-01 09:37:21.010 250710 DEBUG os_brick.initiator.connectors.lightos [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98
Dec 01 09:37:21 compute-0 nova_compute[250706]: 2025-12-01 09:37:21.012 250710 DEBUG os_brick.initiator.connectors.lightos [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76
Dec 01 09:37:21 compute-0 nova_compute[250706]: 2025-12-01 09:37:21.012 250710 DEBUG os_brick.initiator.connectors.lightos [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:a5e1c538-e584-4a91-80ec-e9f1f28a5aed dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79
Dec 01 09:37:21 compute-0 nova_compute[250706]: 2025-12-01 09:37:21.013 250710 DEBUG os_brick.utils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] <== get_connector_properties: return (1030ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.100', 'host': 'compute-0.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:44dd6092e7fe', 'do_local_attach': False, 'nvme_hostid': 'a5e1c538-e584-4a91-80ec-e9f1f28a5aed', 'system uuid': '52310927-1d30-4bda-9d2b-fd9f7cfadc4d', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:a5e1c538-e584-4a91-80ec-e9f1f28a5aed', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203
Dec 01 09:37:21 compute-0 nova_compute[250706]: 2025-12-01 09:37:21.013 250710 DEBUG nova.virt.block_device [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Updating existing volume attachment record: 48a15dd1-d08e-4947-98cd-2a9168bf85d9 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631
Dec 01 09:37:21 compute-0 sudo[256478]: pam_unix(sudo:session): session closed for user root
Dec 01 09:37:21 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:37:21 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:37:21 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:37:21 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:37:21 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 91f242c3-8997-4e22-8df6-7a888fc8ffd5 does not exist
Dec 01 09:37:21 compute-0 sudo[256676]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:37:21 compute-0 sudo[256676]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:37:21 compute-0 sudo[256676]: pam_unix(sudo:session): session closed for user root
Dec 01 09:37:21 compute-0 sudo[256701]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:37:21 compute-0 sudo[256701]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:37:21 compute-0 sudo[256701]: pam_unix(sudo:session): session closed for user root
Dec 01 09:37:21 compute-0 ceph-mon[75031]: pgmap v784: 193 pgs: 193 active+clean; 41 MiB data, 130 MiB used, 60 GiB / 60 GiB avail; 91 KiB/s rd, 7.5 KiB/s wr, 123 op/s
Dec 01 09:37:21 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:37:21 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:37:21 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v786: 193 pgs: 2 active+clean+snaptrim, 8 active+clean+snaptrim_wait, 183 active+clean; 41 MiB data, 130 MiB used, 60 GiB / 60 GiB avail; 288 KiB/s rd, 25 KiB/s wr, 393 op/s
Dec 01 09:37:21 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Dec 01 09:37:21 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1389057543' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 09:37:22 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/1389057543' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 09:37:22 compute-0 nova_compute[250706]: 2025-12-01 09:37:22.482 250710 DEBUG nova.compute.manager [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 01 09:37:22 compute-0 nova_compute[250706]: 2025-12-01 09:37:22.484 250710 DEBUG nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 01 09:37:22 compute-0 nova_compute[250706]: 2025-12-01 09:37:22.484 250710 INFO nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Creating image(s)
Dec 01 09:37:22 compute-0 nova_compute[250706]: 2025-12-01 09:37:22.485 250710 DEBUG nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Dec 01 09:37:22 compute-0 nova_compute[250706]: 2025-12-01 09:37:22.485 250710 DEBUG nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Ensure instance console log exists: /var/lib/nova/instances/6740b382-574d-4ced-a156-11a531b94114/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 01 09:37:22 compute-0 nova_compute[250706]: 2025-12-01 09:37:22.485 250710 DEBUG oslo_concurrency.lockutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:37:22 compute-0 nova_compute[250706]: 2025-12-01 09:37:22.486 250710 DEBUG oslo_concurrency.lockutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:37:22 compute-0 nova_compute[250706]: 2025-12-01 09:37:22.486 250710 DEBUG oslo_concurrency.lockutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:37:22 compute-0 nova_compute[250706]: 2025-12-01 09:37:22.488 250710 DEBUG nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'guest_format': None, 'attachment_id': '48a15dd1-d08e-4947-98cd-2a9168bf85d9', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-61c9bc29-5cbf-4816-a0ae-b24ddf88776c', 'hosts': ['192.168.122.100'], 'ports': ['6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '61c9bc29-5cbf-4816-a0ae-b24ddf88776c', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '6740b382-574d-4ced-a156-11a531b94114', 'attached_at': '', 'detached_at': '', 'volume_id': '61c9bc29-5cbf-4816-a0ae-b24ddf88776c', 'serial': '61c9bc29-5cbf-4816-a0ae-b24ddf88776c'}, 'device_type': 'disk', 'delete_on_termination': True, 'boot_index': 0, 'disk_bus': 'virtio', 'mount_device': '/dev/vda', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 01 09:37:22 compute-0 nova_compute[250706]: 2025-12-01 09:37:22.494 250710 WARNING nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 09:37:22 compute-0 nova_compute[250706]: 2025-12-01 09:37:22.501 250710 DEBUG nova.virt.libvirt.host [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 01 09:37:22 compute-0 nova_compute[250706]: 2025-12-01 09:37:22.501 250710 DEBUG nova.virt.libvirt.host [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 01 09:37:22 compute-0 nova_compute[250706]: 2025-12-01 09:37:22.504 250710 DEBUG nova.virt.libvirt.host [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 01 09:37:22 compute-0 nova_compute[250706]: 2025-12-01 09:37:22.505 250710 DEBUG nova.virt.libvirt.host [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 01 09:37:22 compute-0 nova_compute[250706]: 2025-12-01 09:37:22.505 250710 DEBUG nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 01 09:37:22 compute-0 nova_compute[250706]: 2025-12-01 09:37:22.505 250710 DEBUG nova.virt.hardware [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-01T09:36:17Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dff9230f-1656-4ee2-9f6d-710f2458058e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 01 09:37:22 compute-0 nova_compute[250706]: 2025-12-01 09:37:22.506 250710 DEBUG nova.virt.hardware [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 01 09:37:22 compute-0 nova_compute[250706]: 2025-12-01 09:37:22.506 250710 DEBUG nova.virt.hardware [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 01 09:37:22 compute-0 nova_compute[250706]: 2025-12-01 09:37:22.506 250710 DEBUG nova.virt.hardware [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 01 09:37:22 compute-0 nova_compute[250706]: 2025-12-01 09:37:22.506 250710 DEBUG nova.virt.hardware [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 01 09:37:22 compute-0 nova_compute[250706]: 2025-12-01 09:37:22.507 250710 DEBUG nova.virt.hardware [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 01 09:37:22 compute-0 nova_compute[250706]: 2025-12-01 09:37:22.507 250710 DEBUG nova.virt.hardware [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 01 09:37:22 compute-0 nova_compute[250706]: 2025-12-01 09:37:22.507 250710 DEBUG nova.virt.hardware [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 01 09:37:22 compute-0 nova_compute[250706]: 2025-12-01 09:37:22.507 250710 DEBUG nova.virt.hardware [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 01 09:37:22 compute-0 nova_compute[250706]: 2025-12-01 09:37:22.507 250710 DEBUG nova.virt.hardware [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 01 09:37:22 compute-0 nova_compute[250706]: 2025-12-01 09:37:22.508 250710 DEBUG nova.virt.hardware [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 01 09:37:22 compute-0 nova_compute[250706]: 2025-12-01 09:37:22.533 250710 DEBUG nova.storage.rbd_utils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] rbd image 6740b382-574d-4ced-a156-11a531b94114_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 01 09:37:22 compute-0 nova_compute[250706]: 2025-12-01 09:37:22.538 250710 DEBUG nova.privsep.utils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 01 09:37:22 compute-0 nova_compute[250706]: 2025-12-01 09:37:22.539 250710 DEBUG oslo_concurrency.processutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:37:22 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Dec 01 09:37:22 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3560338767' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 09:37:22 compute-0 podman[256764]: 2025-12-01 09:37:22.976156283 +0000 UTC m=+0.072650241 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Dec 01 09:37:22 compute-0 nova_compute[250706]: 2025-12-01 09:37:22.997 250710 DEBUG oslo_concurrency.processutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:37:23 compute-0 nova_compute[250706]: 2025-12-01 09:37:22.999 250710 DEBUG oslo_concurrency.lockutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquiring lock "cache_volume_driver" by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:37:23 compute-0 nova_compute[250706]: 2025-12-01 09:37:23.001 250710 DEBUG oslo_concurrency.lockutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "cache_volume_driver" acquired by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:37:23 compute-0 nova_compute[250706]: 2025-12-01 09:37:23.004 250710 DEBUG oslo_concurrency.lockutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "cache_volume_driver" "released" by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" :: held 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:37:23 compute-0 systemd[1]: Starting libvirt secret daemon...
Dec 01 09:37:23 compute-0 systemd[1]: Started libvirt secret daemon.
Dec 01 09:37:23 compute-0 nova_compute[250706]: 2025-12-01 09:37:23.107 250710 DEBUG nova.objects.instance [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lazy-loading 'pci_devices' on Instance uuid 6740b382-574d-4ced-a156-11a531b94114 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 01 09:37:23 compute-0 nova_compute[250706]: 2025-12-01 09:37:23.131 250710 DEBUG nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] End _get_guest_xml xml=<domain type="kvm">
Dec 01 09:37:23 compute-0 nova_compute[250706]:   <uuid>6740b382-574d-4ced-a156-11a531b94114</uuid>
Dec 01 09:37:23 compute-0 nova_compute[250706]:   <name>instance-00000001</name>
Dec 01 09:37:23 compute-0 nova_compute[250706]:   <memory>131072</memory>
Dec 01 09:37:23 compute-0 nova_compute[250706]:   <vcpu>1</vcpu>
Dec 01 09:37:23 compute-0 nova_compute[250706]:   <metadata>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 01 09:37:23 compute-0 nova_compute[250706]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:       <nova:name>instance-depend-image</nova:name>
Dec 01 09:37:23 compute-0 nova_compute[250706]:       <nova:creationTime>2025-12-01 09:37:22</nova:creationTime>
Dec 01 09:37:23 compute-0 nova_compute[250706]:       <nova:flavor name="m1.nano">
Dec 01 09:37:23 compute-0 nova_compute[250706]:         <nova:memory>128</nova:memory>
Dec 01 09:37:23 compute-0 nova_compute[250706]:         <nova:disk>1</nova:disk>
Dec 01 09:37:23 compute-0 nova_compute[250706]:         <nova:swap>0</nova:swap>
Dec 01 09:37:23 compute-0 nova_compute[250706]:         <nova:ephemeral>0</nova:ephemeral>
Dec 01 09:37:23 compute-0 nova_compute[250706]:         <nova:vcpus>1</nova:vcpus>
Dec 01 09:37:23 compute-0 nova_compute[250706]:       </nova:flavor>
Dec 01 09:37:23 compute-0 nova_compute[250706]:       <nova:owner>
Dec 01 09:37:23 compute-0 nova_compute[250706]:         <nova:user uuid="14165de8e6af473c94a109257a29c50c">tempest-ImageDependencyTests-805054756-project-member</nova:user>
Dec 01 09:37:23 compute-0 nova_compute[250706]:         <nova:project uuid="8a9d236048d24c39893cd69ad598bc1a">tempest-ImageDependencyTests-805054756</nova:project>
Dec 01 09:37:23 compute-0 nova_compute[250706]:       </nova:owner>
Dec 01 09:37:23 compute-0 nova_compute[250706]:       <nova:ports/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     </nova:instance>
Dec 01 09:37:23 compute-0 nova_compute[250706]:   </metadata>
Dec 01 09:37:23 compute-0 nova_compute[250706]:   <sysinfo type="smbios">
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <system>
Dec 01 09:37:23 compute-0 nova_compute[250706]:       <entry name="manufacturer">RDO</entry>
Dec 01 09:37:23 compute-0 nova_compute[250706]:       <entry name="product">OpenStack Compute</entry>
Dec 01 09:37:23 compute-0 nova_compute[250706]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 01 09:37:23 compute-0 nova_compute[250706]:       <entry name="serial">6740b382-574d-4ced-a156-11a531b94114</entry>
Dec 01 09:37:23 compute-0 nova_compute[250706]:       <entry name="uuid">6740b382-574d-4ced-a156-11a531b94114</entry>
Dec 01 09:37:23 compute-0 nova_compute[250706]:       <entry name="family">Virtual Machine</entry>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     </system>
Dec 01 09:37:23 compute-0 nova_compute[250706]:   </sysinfo>
Dec 01 09:37:23 compute-0 nova_compute[250706]:   <os>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <boot dev="hd"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <smbios mode="sysinfo"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:   </os>
Dec 01 09:37:23 compute-0 nova_compute[250706]:   <features>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <acpi/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <apic/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <vmcoreinfo/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:   </features>
Dec 01 09:37:23 compute-0 nova_compute[250706]:   <clock offset="utc">
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <timer name="pit" tickpolicy="delay"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <timer name="hpet" present="no"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:   </clock>
Dec 01 09:37:23 compute-0 nova_compute[250706]:   <cpu mode="host-model" match="exact">
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <topology sockets="1" cores="1" threads="1"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:   </cpu>
Dec 01 09:37:23 compute-0 nova_compute[250706]:   <devices>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <disk type="network" device="cdrom">
Dec 01 09:37:23 compute-0 nova_compute[250706]:       <driver type="raw" cache="none"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:       <source protocol="rbd" name="vms/6740b382-574d-4ced-a156-11a531b94114_disk.config">
Dec 01 09:37:23 compute-0 nova_compute[250706]:         <host name="192.168.122.100" port="6789"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:       </source>
Dec 01 09:37:23 compute-0 nova_compute[250706]:       <auth username="openstack">
Dec 01 09:37:23 compute-0 nova_compute[250706]:         <secret type="ceph" uuid="5620a9fb-e540-5250-a0e8-7aaad5347e3b"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:       </auth>
Dec 01 09:37:23 compute-0 nova_compute[250706]:       <target dev="sda" bus="sata"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     </disk>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <disk type="network" device="disk">
Dec 01 09:37:23 compute-0 nova_compute[250706]:       <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:       <source protocol="rbd" name="volumes/volume-61c9bc29-5cbf-4816-a0ae-b24ddf88776c">
Dec 01 09:37:23 compute-0 nova_compute[250706]:         <host name="192.168.122.100" port="6789"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:       </source>
Dec 01 09:37:23 compute-0 nova_compute[250706]:       <auth username="openstack">
Dec 01 09:37:23 compute-0 nova_compute[250706]:         <secret type="ceph" uuid="5620a9fb-e540-5250-a0e8-7aaad5347e3b"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:       </auth>
Dec 01 09:37:23 compute-0 nova_compute[250706]:       <target dev="vda" bus="virtio"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:       <serial>61c9bc29-5cbf-4816-a0ae-b24ddf88776c</serial>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     </disk>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <serial type="pty">
Dec 01 09:37:23 compute-0 nova_compute[250706]:       <log file="/var/lib/nova/instances/6740b382-574d-4ced-a156-11a531b94114/console.log" append="off"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     </serial>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <video>
Dec 01 09:37:23 compute-0 nova_compute[250706]:       <model type="virtio"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     </video>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <input type="tablet" bus="usb"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <rng model="virtio">
Dec 01 09:37:23 compute-0 nova_compute[250706]:       <backend model="random">/dev/urandom</backend>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     </rng>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <controller type="usb" index="0"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     <memballoon model="virtio">
Dec 01 09:37:23 compute-0 nova_compute[250706]:       <stats period="10"/>
Dec 01 09:37:23 compute-0 nova_compute[250706]:     </memballoon>
Dec 01 09:37:23 compute-0 nova_compute[250706]:   </devices>
Dec 01 09:37:23 compute-0 nova_compute[250706]: </domain>
Dec 01 09:37:23 compute-0 nova_compute[250706]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 01 09:37:23 compute-0 nova_compute[250706]: 2025-12-01 09:37:23.188 250710 DEBUG nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 01 09:37:23 compute-0 nova_compute[250706]: 2025-12-01 09:37:23.188 250710 DEBUG nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 01 09:37:23 compute-0 nova_compute[250706]: 2025-12-01 09:37:23.189 250710 INFO nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Using config drive
Dec 01 09:37:23 compute-0 nova_compute[250706]: 2025-12-01 09:37:23.217 250710 DEBUG nova.storage.rbd_utils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] rbd image 6740b382-574d-4ced-a156-11a531b94114_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 01 09:37:23 compute-0 ceph-mon[75031]: pgmap v786: 193 pgs: 2 active+clean+snaptrim, 8 active+clean+snaptrim_wait, 183 active+clean; 41 MiB data, 130 MiB used, 60 GiB / 60 GiB avail; 288 KiB/s rd, 25 KiB/s wr, 393 op/s
Dec 01 09:37:23 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3560338767' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 09:37:23 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v787: 193 pgs: 2 active+clean+snaptrim, 8 active+clean+snaptrim_wait, 183 active+clean; 41 MiB data, 148 MiB used, 60 GiB / 60 GiB avail; 166 KiB/s rd, 15 KiB/s wr, 230 op/s
Dec 01 09:37:23 compute-0 nova_compute[250706]: 2025-12-01 09:37:23.837 250710 INFO nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Creating config drive at /var/lib/nova/instances/6740b382-574d-4ced-a156-11a531b94114/disk.config
Dec 01 09:37:23 compute-0 nova_compute[250706]: 2025-12-01 09:37:23.842 250710 DEBUG oslo_concurrency.processutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6740b382-574d-4ced-a156-11a531b94114/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2l2wouwm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:37:23 compute-0 nova_compute[250706]: 2025-12-01 09:37:23.973 250710 DEBUG oslo_concurrency.processutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6740b382-574d-4ced-a156-11a531b94114/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2l2wouwm" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:37:23 compute-0 nova_compute[250706]: 2025-12-01 09:37:23.998 250710 DEBUG nova.storage.rbd_utils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] rbd image 6740b382-574d-4ced-a156-11a531b94114_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 01 09:37:24 compute-0 nova_compute[250706]: 2025-12-01 09:37:24.002 250710 DEBUG oslo_concurrency.processutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/6740b382-574d-4ced-a156-11a531b94114/disk.config 6740b382-574d-4ced-a156-11a531b94114_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:37:24 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e71 do_prune osdmap full prune enabled
Dec 01 09:37:24 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e72 e72: 3 total, 3 up, 3 in
Dec 01 09:37:24 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e72: 3 total, 3 up, 3 in
Dec 01 09:37:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e72 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:37:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e72 do_prune osdmap full prune enabled
Dec 01 09:37:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e73 e73: 3 total, 3 up, 3 in
Dec 01 09:37:25 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e73: 3 total, 3 up, 3 in
Dec 01 09:37:25 compute-0 nova_compute[250706]: 2025-12-01 09:37:25.335 250710 DEBUG oslo_concurrency.processutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/6740b382-574d-4ced-a156-11a531b94114/disk.config 6740b382-574d-4ced-a156-11a531b94114_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.333s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:37:25 compute-0 nova_compute[250706]: 2025-12-01 09:37:25.336 250710 INFO nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Deleting local config drive /var/lib/nova/instances/6740b382-574d-4ced-a156-11a531b94114/disk.config because it was imported into RBD.
Dec 01 09:37:25 compute-0 ceph-mon[75031]: pgmap v787: 193 pgs: 2 active+clean+snaptrim, 8 active+clean+snaptrim_wait, 183 active+clean; 41 MiB data, 148 MiB used, 60 GiB / 60 GiB avail; 166 KiB/s rd, 15 KiB/s wr, 230 op/s
Dec 01 09:37:25 compute-0 ceph-mon[75031]: osdmap e72: 3 total, 3 up, 3 in
Dec 01 09:37:25 compute-0 ceph-mon[75031]: osdmap e73: 3 total, 3 up, 3 in
Dec 01 09:37:25 compute-0 systemd-machined[212908]: New machine qemu-1-instance-00000001.
Dec 01 09:37:25 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Dec 01 09:37:25 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v790: 193 pgs: 193 active+clean; 41 MiB data, 148 MiB used, 60 GiB / 60 GiB avail; 176 KiB/s rd, 16 KiB/s wr, 244 op/s
Dec 01 09:37:26 compute-0 nova_compute[250706]: 2025-12-01 09:37:26.210 250710 DEBUG nova.compute.manager [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 01 09:37:26 compute-0 nova_compute[250706]: 2025-12-01 09:37:26.213 250710 DEBUG nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 01 09:37:26 compute-0 nova_compute[250706]: 2025-12-01 09:37:26.214 250710 DEBUG nova.virt.driver [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Emitting event <LifecycleEvent: 1764581846.211662, 6740b382-574d-4ced-a156-11a531b94114 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 01 09:37:26 compute-0 nova_compute[250706]: 2025-12-01 09:37:26.214 250710 INFO nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] [instance: 6740b382-574d-4ced-a156-11a531b94114] VM Resumed (Lifecycle Event)
Dec 01 09:37:26 compute-0 nova_compute[250706]: 2025-12-01 09:37:26.223 250710 INFO nova.virt.libvirt.driver [-] [instance: 6740b382-574d-4ced-a156-11a531b94114] Instance spawned successfully.
Dec 01 09:37:26 compute-0 nova_compute[250706]: 2025-12-01 09:37:26.224 250710 DEBUG nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 01 09:37:26 compute-0 nova_compute[250706]: 2025-12-01 09:37:26.296 250710 DEBUG nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] [instance: 6740b382-574d-4ced-a156-11a531b94114] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 01 09:37:26 compute-0 nova_compute[250706]: 2025-12-01 09:37:26.300 250710 DEBUG nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] [instance: 6740b382-574d-4ced-a156-11a531b94114] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 01 09:37:26 compute-0 nova_compute[250706]: 2025-12-01 09:37:26.314 250710 DEBUG nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 01 09:37:26 compute-0 nova_compute[250706]: 2025-12-01 09:37:26.314 250710 DEBUG nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 01 09:37:26 compute-0 nova_compute[250706]: 2025-12-01 09:37:26.315 250710 DEBUG nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 01 09:37:26 compute-0 nova_compute[250706]: 2025-12-01 09:37:26.315 250710 DEBUG nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 01 09:37:26 compute-0 nova_compute[250706]: 2025-12-01 09:37:26.316 250710 DEBUG nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 01 09:37:26 compute-0 nova_compute[250706]: 2025-12-01 09:37:26.316 250710 DEBUG nova.virt.libvirt.driver [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 01 09:37:26 compute-0 nova_compute[250706]: 2025-12-01 09:37:26.322 250710 INFO nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] [instance: 6740b382-574d-4ced-a156-11a531b94114] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 01 09:37:26 compute-0 nova_compute[250706]: 2025-12-01 09:37:26.322 250710 DEBUG nova.virt.driver [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Emitting event <LifecycleEvent: 1764581846.2117918, 6740b382-574d-4ced-a156-11a531b94114 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 01 09:37:26 compute-0 nova_compute[250706]: 2025-12-01 09:37:26.322 250710 INFO nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] [instance: 6740b382-574d-4ced-a156-11a531b94114] VM Started (Lifecycle Event)
Dec 01 09:37:26 compute-0 nova_compute[250706]: 2025-12-01 09:37:26.379 250710 DEBUG nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] [instance: 6740b382-574d-4ced-a156-11a531b94114] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 01 09:37:26 compute-0 nova_compute[250706]: 2025-12-01 09:37:26.383 250710 DEBUG nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] [instance: 6740b382-574d-4ced-a156-11a531b94114] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 01 09:37:26 compute-0 nova_compute[250706]: 2025-12-01 09:37:26.388 250710 INFO nova.compute.manager [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Took 3.90 seconds to spawn the instance on the hypervisor.
Dec 01 09:37:26 compute-0 nova_compute[250706]: 2025-12-01 09:37:26.389 250710 DEBUG nova.compute.manager [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 01 09:37:26 compute-0 nova_compute[250706]: 2025-12-01 09:37:26.405 250710 INFO nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] [instance: 6740b382-574d-4ced-a156-11a531b94114] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 01 09:37:26 compute-0 nova_compute[250706]: 2025-12-01 09:37:26.469 250710 INFO nova.compute.manager [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Took 7.82 seconds to build instance.
Dec 01 09:37:26 compute-0 nova_compute[250706]: 2025-12-01 09:37:26.491 250710 DEBUG oslo_concurrency.lockutils [None req-9b49992c-241a-4497-a7e1-00c2cbe88f67 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "6740b382-574d-4ced-a156-11a531b94114" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.931s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:37:27 compute-0 nova_compute[250706]: 2025-12-01 09:37:27.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:37:27 compute-0 nova_compute[250706]: 2025-12-01 09:37:27.053 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 01 09:37:27 compute-0 nova_compute[250706]: 2025-12-01 09:37:27.073 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 01 09:37:27 compute-0 nova_compute[250706]: 2025-12-01 09:37:27.074 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:37:27 compute-0 nova_compute[250706]: 2025-12-01 09:37:27.075 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 01 09:37:27 compute-0 nova_compute[250706]: 2025-12-01 09:37:27.093 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:37:27 compute-0 ceph-mon[75031]: pgmap v790: 193 pgs: 193 active+clean; 41 MiB data, 148 MiB used, 60 GiB / 60 GiB avail; 176 KiB/s rd, 16 KiB/s wr, 244 op/s
Dec 01 09:37:27 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v791: 193 pgs: 193 active+clean; 41 MiB data, 148 MiB used, 60 GiB / 60 GiB avail; 125 KiB/s rd, 28 KiB/s wr, 170 op/s
Dec 01 09:37:28 compute-0 ceph-mon[75031]: pgmap v791: 193 pgs: 193 active+clean; 41 MiB data, 148 MiB used, 60 GiB / 60 GiB avail; 125 KiB/s rd, 28 KiB/s wr, 170 op/s
Dec 01 09:37:29 compute-0 nova_compute[250706]: 2025-12-01 09:37:29.118 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:37:29 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e73 do_prune osdmap full prune enabled
Dec 01 09:37:29 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e74 e74: 3 total, 3 up, 3 in
Dec 01 09:37:29 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e74: 3 total, 3 up, 3 in
Dec 01 09:37:29 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v793: 193 pgs: 193 active+clean; 41 MiB data, 148 MiB used, 60 GiB / 60 GiB avail; 24 KiB/s rd, 25 KiB/s wr, 31 op/s
Dec 01 09:37:30 compute-0 nova_compute[250706]: 2025-12-01 09:37:30.048 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:37:30 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e74 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:37:30 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e74 do_prune osdmap full prune enabled
Dec 01 09:37:30 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e75 e75: 3 total, 3 up, 3 in
Dec 01 09:37:30 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e75: 3 total, 3 up, 3 in
Dec 01 09:37:30 compute-0 podman[256920]: 2025-12-01 09:37:30.398441771 +0000 UTC m=+0.150803401 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 01 09:37:30 compute-0 ceph-mon[75031]: osdmap e74: 3 total, 3 up, 3 in
Dec 01 09:37:30 compute-0 ceph-mon[75031]: pgmap v793: 193 pgs: 193 active+clean; 41 MiB data, 148 MiB used, 60 GiB / 60 GiB avail; 24 KiB/s rd, 25 KiB/s wr, 31 op/s
Dec 01 09:37:30 compute-0 ceph-mon[75031]: osdmap e75: 3 total, 3 up, 3 in
Dec 01 09:37:31 compute-0 nova_compute[250706]: 2025-12-01 09:37:31.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:37:31 compute-0 nova_compute[250706]: 2025-12-01 09:37:31.053 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 09:37:31 compute-0 nova_compute[250706]: 2025-12-01 09:37:31.054 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 09:37:31 compute-0 nova_compute[250706]: 2025-12-01 09:37:31.571 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "refresh_cache-6740b382-574d-4ced-a156-11a531b94114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 01 09:37:31 compute-0 nova_compute[250706]: 2025-12-01 09:37:31.572 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquired lock "refresh_cache-6740b382-574d-4ced-a156-11a531b94114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 01 09:37:31 compute-0 nova_compute[250706]: 2025-12-01 09:37:31.572 250710 DEBUG nova.network.neutron [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] [instance: 6740b382-574d-4ced-a156-11a531b94114] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 01 09:37:31 compute-0 nova_compute[250706]: 2025-12-01 09:37:31.573 250710 DEBUG nova.objects.instance [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6740b382-574d-4ced-a156-11a531b94114 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 01 09:37:31 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v795: 193 pgs: 193 active+clean; 41 MiB data, 166 MiB used, 60 GiB / 60 GiB avail; 37 KiB/s rd, 25 KiB/s wr, 48 op/s
Dec 01 09:37:31 compute-0 nova_compute[250706]: 2025-12-01 09:37:31.903 250710 DEBUG nova.network.neutron [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] [instance: 6740b382-574d-4ced-a156-11a531b94114] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 01 09:37:32 compute-0 nova_compute[250706]: 2025-12-01 09:37:32.200 250710 DEBUG nova.network.neutron [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] [instance: 6740b382-574d-4ced-a156-11a531b94114] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 01 09:37:32 compute-0 nova_compute[250706]: 2025-12-01 09:37:32.221 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Releasing lock "refresh_cache-6740b382-574d-4ced-a156-11a531b94114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 01 09:37:32 compute-0 nova_compute[250706]: 2025-12-01 09:37:32.222 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] [instance: 6740b382-574d-4ced-a156-11a531b94114] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 01 09:37:32 compute-0 nova_compute[250706]: 2025-12-01 09:37:32.223 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:37:32 compute-0 nova_compute[250706]: 2025-12-01 09:37:32.223 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:37:32 compute-0 nova_compute[250706]: 2025-12-01 09:37:32.223 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:37:32 compute-0 nova_compute[250706]: 2025-12-01 09:37:32.224 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:37:32 compute-0 nova_compute[250706]: 2025-12-01 09:37:32.224 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 09:37:32 compute-0 nova_compute[250706]: 2025-12-01 09:37:32.224 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:37:32 compute-0 nova_compute[250706]: 2025-12-01 09:37:32.252 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:37:32 compute-0 nova_compute[250706]: 2025-12-01 09:37:32.253 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:37:32 compute-0 nova_compute[250706]: 2025-12-01 09:37:32.253 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:37:32 compute-0 nova_compute[250706]: 2025-12-01 09:37:32.254 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 09:37:32 compute-0 nova_compute[250706]: 2025-12-01 09:37:32.254 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:37:32 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:37:32 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1522047460' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:37:32 compute-0 nova_compute[250706]: 2025-12-01 09:37:32.730 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:37:32 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e75 do_prune osdmap full prune enabled
Dec 01 09:37:32 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e76 e76: 3 total, 3 up, 3 in
Dec 01 09:37:32 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e76: 3 total, 3 up, 3 in
Dec 01 09:37:32 compute-0 ceph-mon[75031]: pgmap v795: 193 pgs: 193 active+clean; 41 MiB data, 166 MiB used, 60 GiB / 60 GiB avail; 37 KiB/s rd, 25 KiB/s wr, 48 op/s
Dec 01 09:37:32 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1522047460' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:37:32 compute-0 nova_compute[250706]: 2025-12-01 09:37:32.836 250710 DEBUG nova.virt.libvirt.driver [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 01 09:37:32 compute-0 nova_compute[250706]: 2025-12-01 09:37:32.836 250710 DEBUG nova.virt.libvirt.driver [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 01 09:37:32 compute-0 nova_compute[250706]: 2025-12-01 09:37:32.985 250710 WARNING nova.virt.libvirt.driver [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 09:37:32 compute-0 nova_compute[250706]: 2025-12-01 09:37:32.986 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5209MB free_disk=59.98813247680664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 09:37:32 compute-0 nova_compute[250706]: 2025-12-01 09:37:32.986 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:37:32 compute-0 nova_compute[250706]: 2025-12-01 09:37:32.986 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:37:33 compute-0 nova_compute[250706]: 2025-12-01 09:37:33.250 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Instance 6740b382-574d-4ced-a156-11a531b94114 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 01 09:37:33 compute-0 nova_compute[250706]: 2025-12-01 09:37:33.251 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 09:37:33 compute-0 nova_compute[250706]: 2025-12-01 09:37:33.251 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 09:37:33 compute-0 nova_compute[250706]: 2025-12-01 09:37:33.366 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Refreshing inventories for resource provider 847e3dbe-0f76-4032-a374-8c965945c22f _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 01 09:37:33 compute-0 nova_compute[250706]: 2025-12-01 09:37:33.459 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Updating ProviderTree inventory for provider 847e3dbe-0f76-4032-a374-8c965945c22f from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 01 09:37:33 compute-0 nova_compute[250706]: 2025-12-01 09:37:33.460 250710 DEBUG nova.compute.provider_tree [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Updating inventory in ProviderTree for provider 847e3dbe-0f76-4032-a374-8c965945c22f with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 01 09:37:33 compute-0 nova_compute[250706]: 2025-12-01 09:37:33.479 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Refreshing aggregate associations for resource provider 847e3dbe-0f76-4032-a374-8c965945c22f, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 01 09:37:33 compute-0 nova_compute[250706]: 2025-12-01 09:37:33.504 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Refreshing trait associations for resource provider 847e3dbe-0f76-4032-a374-8c965945c22f, traits: COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AVX2,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE4A,HW_CPU_X86_AESNI,HW_CPU_X86_SVM,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_F16C,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_FMA3,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_MMX,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_BMI2,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSSE3,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 01 09:37:33 compute-0 nova_compute[250706]: 2025-12-01 09:37:33.550 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:37:33 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v797: 193 pgs: 193 active+clean; 41 MiB data, 166 MiB used, 60 GiB / 60 GiB avail; 70 KiB/s rd, 3.3 KiB/s wr, 90 op/s
Dec 01 09:37:33 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e76 do_prune osdmap full prune enabled
Dec 01 09:37:33 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e77 e77: 3 total, 3 up, 3 in
Dec 01 09:37:33 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e77: 3 total, 3 up, 3 in
Dec 01 09:37:33 compute-0 ceph-mon[75031]: osdmap e76: 3 total, 3 up, 3 in
Dec 01 09:37:34 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:37:34 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/812207976' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:37:34 compute-0 nova_compute[250706]: 2025-12-01 09:37:34.036 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:37:34 compute-0 nova_compute[250706]: 2025-12-01 09:37:34.042 250710 DEBUG nova.compute.provider_tree [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed in ProviderTree for provider: 847e3dbe-0f76-4032-a374-8c965945c22f update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 09:37:34 compute-0 nova_compute[250706]: 2025-12-01 09:37:34.067 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed for provider 847e3dbe-0f76-4032-a374-8c965945c22f based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 09:37:34 compute-0 nova_compute[250706]: 2025-12-01 09:37:34.092 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 09:37:34 compute-0 nova_compute[250706]: 2025-12-01 09:37:34.093 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:37:34 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e77 do_prune osdmap full prune enabled
Dec 01 09:37:34 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e78 e78: 3 total, 3 up, 3 in
Dec 01 09:37:34 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e78: 3 total, 3 up, 3 in
Dec 01 09:37:34 compute-0 ceph-mon[75031]: pgmap v797: 193 pgs: 193 active+clean; 41 MiB data, 166 MiB used, 60 GiB / 60 GiB avail; 70 KiB/s rd, 3.3 KiB/s wr, 90 op/s
Dec 01 09:37:34 compute-0 ceph-mon[75031]: osdmap e77: 3 total, 3 up, 3 in
Dec 01 09:37:34 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/812207976' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:37:34 compute-0 nova_compute[250706]: 2025-12-01 09:37:34.922 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:37:35 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e78 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:37:35 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e78 do_prune osdmap full prune enabled
Dec 01 09:37:35 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e79 e79: 3 total, 3 up, 3 in
Dec 01 09:37:35 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e79: 3 total, 3 up, 3 in
Dec 01 09:37:35 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v801: 193 pgs: 193 active+clean; 41 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 134 KiB/s rd, 8.5 KiB/s wr, 180 op/s
Dec 01 09:37:35 compute-0 ceph-mon[75031]: osdmap e78: 3 total, 3 up, 3 in
Dec 01 09:37:35 compute-0 ceph-mon[75031]: osdmap e79: 3 total, 3 up, 3 in
Dec 01 09:37:35 compute-0 podman[256992]: 2025-12-01 09:37:35.979139107 +0000 UTC m=+0.076701768 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 01 09:37:36 compute-0 nova_compute[250706]: 2025-12-01 09:37:36.949 250710 DEBUG oslo_concurrency.lockutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquiring lock "beb3fd59-b728-4e62-bc14-b171eebe8ee3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:37:36 compute-0 nova_compute[250706]: 2025-12-01 09:37:36.950 250710 DEBUG oslo_concurrency.lockutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "beb3fd59-b728-4e62-bc14-b171eebe8ee3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:37:36 compute-0 nova_compute[250706]: 2025-12-01 09:37:36.964 250710 DEBUG nova.compute.manager [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 01 09:37:37 compute-0 nova_compute[250706]: 2025-12-01 09:37:37.030 250710 DEBUG oslo_concurrency.lockutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:37:37 compute-0 nova_compute[250706]: 2025-12-01 09:37:37.031 250710 DEBUG oslo_concurrency.lockutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:37:37 compute-0 nova_compute[250706]: 2025-12-01 09:37:37.039 250710 DEBUG nova.virt.hardware [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 01 09:37:37 compute-0 nova_compute[250706]: 2025-12-01 09:37:37.039 250710 INFO nova.compute.claims [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Claim successful on node compute-0.ctlplane.example.com
Dec 01 09:37:37 compute-0 nova_compute[250706]: 2025-12-01 09:37:37.169 250710 DEBUG oslo_concurrency.processutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:37:37 compute-0 ceph-mon[75031]: pgmap v801: 193 pgs: 193 active+clean; 41 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 134 KiB/s rd, 8.5 KiB/s wr, 180 op/s
Dec 01 09:37:37 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:37:37 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1234859688' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:37:37 compute-0 nova_compute[250706]: 2025-12-01 09:37:37.629 250710 DEBUG oslo_concurrency.processutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:37:37 compute-0 nova_compute[250706]: 2025-12-01 09:37:37.636 250710 DEBUG nova.compute.provider_tree [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Inventory has not changed in ProviderTree for provider: 847e3dbe-0f76-4032-a374-8c965945c22f update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 09:37:37 compute-0 nova_compute[250706]: 2025-12-01 09:37:37.659 250710 DEBUG nova.scheduler.client.report [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Inventory has not changed for provider 847e3dbe-0f76-4032-a374-8c965945c22f based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 09:37:37 compute-0 nova_compute[250706]: 2025-12-01 09:37:37.687 250710 DEBUG oslo_concurrency.lockutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.656s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:37:37 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v802: 193 pgs: 193 active+clean; 41 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 74 KiB/s rd, 6.5 KiB/s wr, 102 op/s
Dec 01 09:37:37 compute-0 nova_compute[250706]: 2025-12-01 09:37:37.688 250710 DEBUG nova.compute.manager [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 01 09:37:37 compute-0 nova_compute[250706]: 2025-12-01 09:37:37.751 250710 DEBUG nova.compute.manager [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 01 09:37:37 compute-0 nova_compute[250706]: 2025-12-01 09:37:37.751 250710 DEBUG nova.network.neutron [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 01 09:37:37 compute-0 nova_compute[250706]: 2025-12-01 09:37:37.782 250710 INFO nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 01 09:37:37 compute-0 nova_compute[250706]: 2025-12-01 09:37:37.803 250710 DEBUG nova.compute.manager [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 01 09:37:37 compute-0 nova_compute[250706]: 2025-12-01 09:37:37.908 250710 DEBUG nova.compute.manager [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 01 09:37:37 compute-0 nova_compute[250706]: 2025-12-01 09:37:37.910 250710 DEBUG nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 01 09:37:37 compute-0 nova_compute[250706]: 2025-12-01 09:37:37.911 250710 INFO nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Creating image(s)
Dec 01 09:37:37 compute-0 nova_compute[250706]: 2025-12-01 09:37:37.945 250710 DEBUG nova.storage.rbd_utils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] rbd image beb3fd59-b728-4e62-bc14-b171eebe8ee3_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 01 09:37:37 compute-0 nova_compute[250706]: 2025-12-01 09:37:37.971 250710 DEBUG nova.storage.rbd_utils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] rbd image beb3fd59-b728-4e62-bc14-b171eebe8ee3_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 01 09:37:37 compute-0 nova_compute[250706]: 2025-12-01 09:37:37.998 250710 DEBUG nova.storage.rbd_utils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] rbd image beb3fd59-b728-4e62-bc14-b171eebe8ee3_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 01 09:37:38 compute-0 nova_compute[250706]: 2025-12-01 09:37:38.001 250710 DEBUG oslo_concurrency.lockutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquiring lock "7d2050fd4f341e6a47ec44656714d34127018d9a" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:37:38 compute-0 nova_compute[250706]: 2025-12-01 09:37:38.002 250710 DEBUG oslo_concurrency.lockutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "7d2050fd4f341e6a47ec44656714d34127018d9a" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:37:38 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1234859688' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:37:38 compute-0 nova_compute[250706]: 2025-12-01 09:37:38.353 250710 DEBUG nova.network.neutron [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Dec 01 09:37:38 compute-0 nova_compute[250706]: 2025-12-01 09:37:38.353 250710 DEBUG nova.compute.manager [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 01 09:37:38 compute-0 nova_compute[250706]: 2025-12-01 09:37:38.358 250710 DEBUG nova.virt.libvirt.imagebackend [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Image locations are: [{'url': 'rbd://5620a9fb-e540-5250-a0e8-7aaad5347e3b/images/44751503-6174-45f0-a7ed-07cbb763b067/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://5620a9fb-e540-5250-a0e8-7aaad5347e3b/images/44751503-6174-45f0-a7ed-07cbb763b067/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085
Dec 01 09:37:38 compute-0 nova_compute[250706]: 2025-12-01 09:37:38.428 250710 DEBUG nova.virt.libvirt.imagebackend [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Selected location: {'url': 'rbd://5620a9fb-e540-5250-a0e8-7aaad5347e3b/images/44751503-6174-45f0-a7ed-07cbb763b067/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094
Dec 01 09:37:38 compute-0 nova_compute[250706]: 2025-12-01 09:37:38.428 250710 DEBUG nova.storage.rbd_utils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] cloning images/44751503-6174-45f0-a7ed-07cbb763b067@snap to None/beb3fd59-b728-4e62-bc14-b171eebe8ee3_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261
Dec 01 09:37:38 compute-0 nova_compute[250706]: 2025-12-01 09:37:38.608 250710 DEBUG oslo_concurrency.lockutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "7d2050fd4f341e6a47ec44656714d34127018d9a" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.606s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:37:38 compute-0 nova_compute[250706]: 2025-12-01 09:37:38.792 250710 DEBUG nova.storage.rbd_utils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] resizing rbd image beb3fd59-b728-4e62-bc14-b171eebe8ee3_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Dec 01 09:37:38 compute-0 nova_compute[250706]: 2025-12-01 09:37:38.897 250710 DEBUG nova.objects.instance [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lazy-loading 'migration_context' on Instance uuid beb3fd59-b728-4e62-bc14-b171eebe8ee3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 01 09:37:38 compute-0 nova_compute[250706]: 2025-12-01 09:37:38.918 250710 DEBUG nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 01 09:37:38 compute-0 nova_compute[250706]: 2025-12-01 09:37:38.918 250710 DEBUG nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Ensure instance console log exists: /var/lib/nova/instances/beb3fd59-b728-4e62-bc14-b171eebe8ee3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 01 09:37:38 compute-0 nova_compute[250706]: 2025-12-01 09:37:38.919 250710 DEBUG oslo_concurrency.lockutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:37:38 compute-0 nova_compute[250706]: 2025-12-01 09:37:38.919 250710 DEBUG oslo_concurrency.lockutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:37:38 compute-0 nova_compute[250706]: 2025-12-01 09:37:38.919 250710 DEBUG oslo_concurrency.lockutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:37:38 compute-0 nova_compute[250706]: 2025-12-01 09:37:38.921 250710 DEBUG nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='75f3695eaf1b320ff6b5ece01c175a54',container_format='bare',created_at=2025-12-01T09:37:34Z,direct_url=<?>,disk_format='raw',id=44751503-6174-45f0-a7ed-07cbb763b067,min_disk=0,min_ram=0,name='tempest-image-dependency-test-1690092867',owner='8a9d236048d24c39893cd69ad598bc1a',properties=ImageMetaProps,protected=<?>,size=1024,status='active',tags=<?>,updated_at=2025-12-01T09:37:34Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'guest_format': None, 'device_type': 'disk', 'encryption_secret_uuid': None, 'encryption_options': None, 'encrypted': False, 'boot_index': 0, 'disk_bus': 'virtio', 'size': 0, 'device_name': '/dev/vda', 'image_id': '44751503-6174-45f0-a7ed-07cbb763b067'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 01 09:37:38 compute-0 nova_compute[250706]: 2025-12-01 09:37:38.927 250710 WARNING nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 09:37:38 compute-0 nova_compute[250706]: 2025-12-01 09:37:38.936 250710 DEBUG nova.virt.libvirt.host [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 01 09:37:38 compute-0 nova_compute[250706]: 2025-12-01 09:37:38.937 250710 DEBUG nova.virt.libvirt.host [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 01 09:37:38 compute-0 nova_compute[250706]: 2025-12-01 09:37:38.940 250710 DEBUG nova.virt.libvirt.host [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 01 09:37:38 compute-0 nova_compute[250706]: 2025-12-01 09:37:38.940 250710 DEBUG nova.virt.libvirt.host [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 01 09:37:38 compute-0 nova_compute[250706]: 2025-12-01 09:37:38.941 250710 DEBUG nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 01 09:37:38 compute-0 nova_compute[250706]: 2025-12-01 09:37:38.941 250710 DEBUG nova.virt.hardware [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-01T09:36:17Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dff9230f-1656-4ee2-9f6d-710f2458058e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='75f3695eaf1b320ff6b5ece01c175a54',container_format='bare',created_at=2025-12-01T09:37:34Z,direct_url=<?>,disk_format='raw',id=44751503-6174-45f0-a7ed-07cbb763b067,min_disk=0,min_ram=0,name='tempest-image-dependency-test-1690092867',owner='8a9d236048d24c39893cd69ad598bc1a',properties=ImageMetaProps,protected=<?>,size=1024,status='active',tags=<?>,updated_at=2025-12-01T09:37:34Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 01 09:37:38 compute-0 nova_compute[250706]: 2025-12-01 09:37:38.941 250710 DEBUG nova.virt.hardware [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 01 09:37:38 compute-0 nova_compute[250706]: 2025-12-01 09:37:38.942 250710 DEBUG nova.virt.hardware [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 01 09:37:38 compute-0 nova_compute[250706]: 2025-12-01 09:37:38.942 250710 DEBUG nova.virt.hardware [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 01 09:37:38 compute-0 nova_compute[250706]: 2025-12-01 09:37:38.942 250710 DEBUG nova.virt.hardware [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 01 09:37:38 compute-0 nova_compute[250706]: 2025-12-01 09:37:38.942 250710 DEBUG nova.virt.hardware [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 01 09:37:38 compute-0 nova_compute[250706]: 2025-12-01 09:37:38.942 250710 DEBUG nova.virt.hardware [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 01 09:37:38 compute-0 nova_compute[250706]: 2025-12-01 09:37:38.943 250710 DEBUG nova.virt.hardware [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 01 09:37:38 compute-0 nova_compute[250706]: 2025-12-01 09:37:38.943 250710 DEBUG nova.virt.hardware [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 01 09:37:38 compute-0 nova_compute[250706]: 2025-12-01 09:37:38.943 250710 DEBUG nova.virt.hardware [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 01 09:37:38 compute-0 nova_compute[250706]: 2025-12-01 09:37:38.943 250710 DEBUG nova.virt.hardware [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 01 09:37:38 compute-0 nova_compute[250706]: 2025-12-01 09:37:38.945 250710 DEBUG oslo_concurrency.processutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:37:39 compute-0 ceph-mon[75031]: pgmap v802: 193 pgs: 193 active+clean; 41 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 74 KiB/s rd, 6.5 KiB/s wr, 102 op/s
Dec 01 09:37:39 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Dec 01 09:37:39 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3000882322' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 09:37:39 compute-0 nova_compute[250706]: 2025-12-01 09:37:39.397 250710 DEBUG oslo_concurrency.processutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:37:39 compute-0 nova_compute[250706]: 2025-12-01 09:37:39.429 250710 DEBUG nova.storage.rbd_utils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] rbd image beb3fd59-b728-4e62-bc14-b171eebe8ee3_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 01 09:37:39 compute-0 nova_compute[250706]: 2025-12-01 09:37:39.434 250710 DEBUG oslo_concurrency.processutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:37:39 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v803: 193 pgs: 193 active+clean; 41 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 61 KiB/s rd, 5.3 KiB/s wr, 84 op/s
Dec 01 09:37:39 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Dec 01 09:37:39 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/166926032' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 09:37:39 compute-0 nova_compute[250706]: 2025-12-01 09:37:39.849 250710 DEBUG oslo_concurrency.processutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:37:39 compute-0 nova_compute[250706]: 2025-12-01 09:37:39.851 250710 DEBUG nova.objects.instance [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lazy-loading 'pci_devices' on Instance uuid beb3fd59-b728-4e62-bc14-b171eebe8ee3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 01 09:37:39 compute-0 nova_compute[250706]: 2025-12-01 09:37:39.880 250710 DEBUG nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] End _get_guest_xml xml=<domain type="kvm">
Dec 01 09:37:39 compute-0 nova_compute[250706]:   <uuid>beb3fd59-b728-4e62-bc14-b171eebe8ee3</uuid>
Dec 01 09:37:39 compute-0 nova_compute[250706]:   <name>instance-00000002</name>
Dec 01 09:37:39 compute-0 nova_compute[250706]:   <memory>131072</memory>
Dec 01 09:37:39 compute-0 nova_compute[250706]:   <vcpu>1</vcpu>
Dec 01 09:37:39 compute-0 nova_compute[250706]:   <metadata>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 01 09:37:39 compute-0 nova_compute[250706]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:       <nova:name>instance-depend-image</nova:name>
Dec 01 09:37:39 compute-0 nova_compute[250706]:       <nova:creationTime>2025-12-01 09:37:38</nova:creationTime>
Dec 01 09:37:39 compute-0 nova_compute[250706]:       <nova:flavor name="m1.nano">
Dec 01 09:37:39 compute-0 nova_compute[250706]:         <nova:memory>128</nova:memory>
Dec 01 09:37:39 compute-0 nova_compute[250706]:         <nova:disk>1</nova:disk>
Dec 01 09:37:39 compute-0 nova_compute[250706]:         <nova:swap>0</nova:swap>
Dec 01 09:37:39 compute-0 nova_compute[250706]:         <nova:ephemeral>0</nova:ephemeral>
Dec 01 09:37:39 compute-0 nova_compute[250706]:         <nova:vcpus>1</nova:vcpus>
Dec 01 09:37:39 compute-0 nova_compute[250706]:       </nova:flavor>
Dec 01 09:37:39 compute-0 nova_compute[250706]:       <nova:owner>
Dec 01 09:37:39 compute-0 nova_compute[250706]:         <nova:user uuid="14165de8e6af473c94a109257a29c50c">tempest-ImageDependencyTests-805054756-project-member</nova:user>
Dec 01 09:37:39 compute-0 nova_compute[250706]:         <nova:project uuid="8a9d236048d24c39893cd69ad598bc1a">tempest-ImageDependencyTests-805054756</nova:project>
Dec 01 09:37:39 compute-0 nova_compute[250706]:       </nova:owner>
Dec 01 09:37:39 compute-0 nova_compute[250706]:       <nova:root type="image" uuid="44751503-6174-45f0-a7ed-07cbb763b067"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:       <nova:ports/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     </nova:instance>
Dec 01 09:37:39 compute-0 nova_compute[250706]:   </metadata>
Dec 01 09:37:39 compute-0 nova_compute[250706]:   <sysinfo type="smbios">
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <system>
Dec 01 09:37:39 compute-0 nova_compute[250706]:       <entry name="manufacturer">RDO</entry>
Dec 01 09:37:39 compute-0 nova_compute[250706]:       <entry name="product">OpenStack Compute</entry>
Dec 01 09:37:39 compute-0 nova_compute[250706]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 01 09:37:39 compute-0 nova_compute[250706]:       <entry name="serial">beb3fd59-b728-4e62-bc14-b171eebe8ee3</entry>
Dec 01 09:37:39 compute-0 nova_compute[250706]:       <entry name="uuid">beb3fd59-b728-4e62-bc14-b171eebe8ee3</entry>
Dec 01 09:37:39 compute-0 nova_compute[250706]:       <entry name="family">Virtual Machine</entry>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     </system>
Dec 01 09:37:39 compute-0 nova_compute[250706]:   </sysinfo>
Dec 01 09:37:39 compute-0 nova_compute[250706]:   <os>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <boot dev="hd"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <smbios mode="sysinfo"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:   </os>
Dec 01 09:37:39 compute-0 nova_compute[250706]:   <features>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <acpi/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <apic/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <vmcoreinfo/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:   </features>
Dec 01 09:37:39 compute-0 nova_compute[250706]:   <clock offset="utc">
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <timer name="pit" tickpolicy="delay"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <timer name="hpet" present="no"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:   </clock>
Dec 01 09:37:39 compute-0 nova_compute[250706]:   <cpu mode="host-model" match="exact">
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <topology sockets="1" cores="1" threads="1"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:   </cpu>
Dec 01 09:37:39 compute-0 nova_compute[250706]:   <devices>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <disk type="network" device="disk">
Dec 01 09:37:39 compute-0 nova_compute[250706]:       <driver type="raw" cache="none"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:       <source protocol="rbd" name="vms/beb3fd59-b728-4e62-bc14-b171eebe8ee3_disk">
Dec 01 09:37:39 compute-0 nova_compute[250706]:         <host name="192.168.122.100" port="6789"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:       </source>
Dec 01 09:37:39 compute-0 nova_compute[250706]:       <auth username="openstack">
Dec 01 09:37:39 compute-0 nova_compute[250706]:         <secret type="ceph" uuid="5620a9fb-e540-5250-a0e8-7aaad5347e3b"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:       </auth>
Dec 01 09:37:39 compute-0 nova_compute[250706]:       <target dev="vda" bus="virtio"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     </disk>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <disk type="network" device="cdrom">
Dec 01 09:37:39 compute-0 nova_compute[250706]:       <driver type="raw" cache="none"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:       <source protocol="rbd" name="vms/beb3fd59-b728-4e62-bc14-b171eebe8ee3_disk.config">
Dec 01 09:37:39 compute-0 nova_compute[250706]:         <host name="192.168.122.100" port="6789"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:       </source>
Dec 01 09:37:39 compute-0 nova_compute[250706]:       <auth username="openstack">
Dec 01 09:37:39 compute-0 nova_compute[250706]:         <secret type="ceph" uuid="5620a9fb-e540-5250-a0e8-7aaad5347e3b"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:       </auth>
Dec 01 09:37:39 compute-0 nova_compute[250706]:       <target dev="sda" bus="sata"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     </disk>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <serial type="pty">
Dec 01 09:37:39 compute-0 nova_compute[250706]:       <log file="/var/lib/nova/instances/beb3fd59-b728-4e62-bc14-b171eebe8ee3/console.log" append="off"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     </serial>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <video>
Dec 01 09:37:39 compute-0 nova_compute[250706]:       <model type="virtio"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     </video>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <input type="tablet" bus="usb"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <rng model="virtio">
Dec 01 09:37:39 compute-0 nova_compute[250706]:       <backend model="random">/dev/urandom</backend>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     </rng>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <controller type="pci" model="pcie-root-port"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <controller type="usb" index="0"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     <memballoon model="virtio">
Dec 01 09:37:39 compute-0 nova_compute[250706]:       <stats period="10"/>
Dec 01 09:37:39 compute-0 nova_compute[250706]:     </memballoon>
Dec 01 09:37:39 compute-0 nova_compute[250706]:   </devices>
Dec 01 09:37:39 compute-0 nova_compute[250706]: </domain>
Dec 01 09:37:39 compute-0 nova_compute[250706]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 01 09:37:39 compute-0 nova_compute[250706]: 2025-12-01 09:37:39.921 250710 DEBUG nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 01 09:37:39 compute-0 nova_compute[250706]: 2025-12-01 09:37:39.922 250710 DEBUG nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 01 09:37:39 compute-0 nova_compute[250706]: 2025-12-01 09:37:39.923 250710 INFO nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Using config drive
Dec 01 09:37:39 compute-0 nova_compute[250706]: 2025-12-01 09:37:39.950 250710 DEBUG nova.storage.rbd_utils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] rbd image beb3fd59-b728-4e62-bc14-b171eebe8ee3_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 01 09:37:40 compute-0 nova_compute[250706]: 2025-12-01 09:37:40.175 250710 INFO nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Creating config drive at /var/lib/nova/instances/beb3fd59-b728-4e62-bc14-b171eebe8ee3/disk.config
Dec 01 09:37:40 compute-0 nova_compute[250706]: 2025-12-01 09:37:40.180 250710 DEBUG oslo_concurrency.processutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/beb3fd59-b728-4e62-bc14-b171eebe8ee3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj0ymohq_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:37:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e79 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:37:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e79 do_prune osdmap full prune enabled
Dec 01 09:37:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e80 e80: 3 total, 3 up, 3 in
Dec 01 09:37:40 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e80: 3 total, 3 up, 3 in
Dec 01 09:37:40 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3000882322' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 09:37:40 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/166926032' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 09:37:40 compute-0 ceph-mon[75031]: osdmap e80: 3 total, 3 up, 3 in
Dec 01 09:37:40 compute-0 nova_compute[250706]: 2025-12-01 09:37:40.318 250710 DEBUG oslo_concurrency.processutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/beb3fd59-b728-4e62-bc14-b171eebe8ee3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj0ymohq_" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:37:40 compute-0 nova_compute[250706]: 2025-12-01 09:37:40.366 250710 DEBUG nova.storage.rbd_utils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] rbd image beb3fd59-b728-4e62-bc14-b171eebe8ee3_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 01 09:37:40 compute-0 nova_compute[250706]: 2025-12-01 09:37:40.371 250710 DEBUG oslo_concurrency.processutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/beb3fd59-b728-4e62-bc14-b171eebe8ee3/disk.config beb3fd59-b728-4e62-bc14-b171eebe8ee3_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:37:40 compute-0 nova_compute[250706]: 2025-12-01 09:37:40.545 250710 DEBUG oslo_concurrency.processutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/beb3fd59-b728-4e62-bc14-b171eebe8ee3/disk.config beb3fd59-b728-4e62-bc14-b171eebe8ee3_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.174s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:37:40 compute-0 nova_compute[250706]: 2025-12-01 09:37:40.547 250710 INFO nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Deleting local config drive /var/lib/nova/instances/beb3fd59-b728-4e62-bc14-b171eebe8ee3/disk.config because it was imported into RBD.
Dec 01 09:37:40 compute-0 systemd-machined[212908]: New machine qemu-2-instance-00000002.
Dec 01 09:37:40 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000002.
Dec 01 09:37:41 compute-0 ceph-mon[75031]: pgmap v803: 193 pgs: 193 active+clean; 41 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 61 KiB/s rd, 5.3 KiB/s wr, 84 op/s
Dec 01 09:37:41 compute-0 nova_compute[250706]: 2025-12-01 09:37:41.398 250710 DEBUG nova.virt.driver [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Emitting event <LifecycleEvent: 1764581861.3982816, beb3fd59-b728-4e62-bc14-b171eebe8ee3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 01 09:37:41 compute-0 nova_compute[250706]: 2025-12-01 09:37:41.399 250710 INFO nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] VM Resumed (Lifecycle Event)
Dec 01 09:37:41 compute-0 nova_compute[250706]: 2025-12-01 09:37:41.402 250710 DEBUG nova.compute.manager [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 01 09:37:41 compute-0 nova_compute[250706]: 2025-12-01 09:37:41.402 250710 DEBUG nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 01 09:37:41 compute-0 nova_compute[250706]: 2025-12-01 09:37:41.406 250710 INFO nova.virt.libvirt.driver [-] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Instance spawned successfully.
Dec 01 09:37:41 compute-0 nova_compute[250706]: 2025-12-01 09:37:41.406 250710 DEBUG nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 01 09:37:41 compute-0 nova_compute[250706]: 2025-12-01 09:37:41.428 250710 DEBUG nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 01 09:37:41 compute-0 nova_compute[250706]: 2025-12-01 09:37:41.434 250710 DEBUG nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 01 09:37:41 compute-0 nova_compute[250706]: 2025-12-01 09:37:41.436 250710 DEBUG nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 01 09:37:41 compute-0 nova_compute[250706]: 2025-12-01 09:37:41.437 250710 DEBUG nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 01 09:37:41 compute-0 nova_compute[250706]: 2025-12-01 09:37:41.437 250710 DEBUG nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 01 09:37:41 compute-0 nova_compute[250706]: 2025-12-01 09:37:41.437 250710 DEBUG nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 01 09:37:41 compute-0 nova_compute[250706]: 2025-12-01 09:37:41.438 250710 DEBUG nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 01 09:37:41 compute-0 nova_compute[250706]: 2025-12-01 09:37:41.438 250710 DEBUG nova.virt.libvirt.driver [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 01 09:37:41 compute-0 nova_compute[250706]: 2025-12-01 09:37:41.474 250710 INFO nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 01 09:37:41 compute-0 nova_compute[250706]: 2025-12-01 09:37:41.474 250710 DEBUG nova.virt.driver [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] Emitting event <LifecycleEvent: 1764581861.3993945, beb3fd59-b728-4e62-bc14-b171eebe8ee3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 01 09:37:41 compute-0 nova_compute[250706]: 2025-12-01 09:37:41.475 250710 INFO nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] VM Started (Lifecycle Event)
Dec 01 09:37:41 compute-0 nova_compute[250706]: 2025-12-01 09:37:41.501 250710 DEBUG nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 01 09:37:41 compute-0 nova_compute[250706]: 2025-12-01 09:37:41.505 250710 DEBUG nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 01 09:37:41 compute-0 nova_compute[250706]: 2025-12-01 09:37:41.510 250710 INFO nova.compute.manager [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Took 3.60 seconds to spawn the instance on the hypervisor.
Dec 01 09:37:41 compute-0 nova_compute[250706]: 2025-12-01 09:37:41.510 250710 DEBUG nova.compute.manager [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 01 09:37:41 compute-0 nova_compute[250706]: 2025-12-01 09:37:41.531 250710 INFO nova.compute.manager [None req-1fa43d2e-21f0-416b-beac-16c40a92f33f - - - - - -] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 01 09:37:41 compute-0 nova_compute[250706]: 2025-12-01 09:37:41.566 250710 INFO nova.compute.manager [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Took 4.56 seconds to build instance.
Dec 01 09:37:41 compute-0 nova_compute[250706]: 2025-12-01 09:37:41.582 250710 DEBUG oslo_concurrency.lockutils [None req-b3c53370-064f-4b2d-826c-decd61f2e05a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "beb3fd59-b728-4e62-bc14-b171eebe8ee3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.632s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:37:41 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v805: 193 pgs: 193 active+clean; 41 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 82 KiB/s rd, 5.5 KiB/s wr, 109 op/s
Dec 01 09:37:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:37:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:37:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:37:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:37:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:37:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:37:43 compute-0 ceph-mon[75031]: pgmap v805: 193 pgs: 193 active+clean; 41 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 82 KiB/s rd, 5.5 KiB/s wr, 109 op/s
Dec 01 09:37:43 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v806: 193 pgs: 193 active+clean; 42 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 82 KiB/s rd, 22 KiB/s wr, 107 op/s
Dec 01 09:37:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Dec 01 09:37:44 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1566923500' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:37:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Dec 01 09:37:44 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1566923500' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:37:44 compute-0 nova_compute[250706]: 2025-12-01 09:37:44.921 250710 DEBUG nova.compute.manager [None req-e07427d1-f157-423e-9881-bc9bd03c99c7 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 01 09:37:44 compute-0 nova_compute[250706]: 2025-12-01 09:37:44.961 250710 INFO nova.compute.manager [None req-e07427d1-f157-423e-9881-bc9bd03c99c7 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] instance snapshotting
Dec 01 09:37:45 compute-0 nova_compute[250706]: 2025-12-01 09:37:45.152 250710 INFO nova.virt.libvirt.driver [None req-e07427d1-f157-423e-9881-bc9bd03c99c7 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Beginning live snapshot process
Dec 01 09:37:45 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e80 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:37:45 compute-0 ceph-mon[75031]: pgmap v806: 193 pgs: 193 active+clean; 42 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 82 KiB/s rd, 22 KiB/s wr, 107 op/s
Dec 01 09:37:45 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/1566923500' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:37:45 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/1566923500' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:37:45 compute-0 nova_compute[250706]: 2025-12-01 09:37:45.355 250710 DEBUG nova.storage.rbd_utils [None req-e07427d1-f157-423e-9881-bc9bd03c99c7 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] creating snapshot(0e245d27dde741ffa3dc5af7777eec0d) on rbd image(beb3fd59-b728-4e62-bc14-b171eebe8ee3_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Dec 01 09:37:45 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v807: 193 pgs: 193 active+clean; 42 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 59 KiB/s rd, 17 KiB/s wr, 75 op/s
Dec 01 09:37:46 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e80 do_prune osdmap full prune enabled
Dec 01 09:37:46 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e81 e81: 3 total, 3 up, 3 in
Dec 01 09:37:46 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e81: 3 total, 3 up, 3 in
Dec 01 09:37:46 compute-0 nova_compute[250706]: 2025-12-01 09:37:46.370 250710 DEBUG nova.storage.rbd_utils [None req-e07427d1-f157-423e-9881-bc9bd03c99c7 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] cloning vms/beb3fd59-b728-4e62-bc14-b171eebe8ee3_disk@0e245d27dde741ffa3dc5af7777eec0d to images/e7cfd47e-36b4-4753-ba43-b81de92dca95 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261
Dec 01 09:37:46 compute-0 nova_compute[250706]: 2025-12-01 09:37:46.498 250710 DEBUG nova.storage.rbd_utils [None req-e07427d1-f157-423e-9881-bc9bd03c99c7 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] flattening images/e7cfd47e-36b4-4753-ba43-b81de92dca95 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314
Dec 01 09:37:46 compute-0 nova_compute[250706]: 2025-12-01 09:37:46.664 250710 DEBUG nova.storage.rbd_utils [None req-e07427d1-f157-423e-9881-bc9bd03c99c7 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] removing snapshot(0e245d27dde741ffa3dc5af7777eec0d) on rbd image(beb3fd59-b728-4e62-bc14-b171eebe8ee3_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489
Dec 01 09:37:47 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e81 do_prune osdmap full prune enabled
Dec 01 09:37:47 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e82 e82: 3 total, 3 up, 3 in
Dec 01 09:37:47 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e82: 3 total, 3 up, 3 in
Dec 01 09:37:47 compute-0 ceph-mon[75031]: pgmap v807: 193 pgs: 193 active+clean; 42 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 59 KiB/s rd, 17 KiB/s wr, 75 op/s
Dec 01 09:37:47 compute-0 ceph-mon[75031]: osdmap e81: 3 total, 3 up, 3 in
Dec 01 09:37:47 compute-0 nova_compute[250706]: 2025-12-01 09:37:47.357 250710 DEBUG nova.storage.rbd_utils [None req-e07427d1-f157-423e-9881-bc9bd03c99c7 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] creating snapshot(snap) on rbd image(e7cfd47e-36b4-4753-ba43-b81de92dca95) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Dec 01 09:37:47 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v810: 193 pgs: 193 active+clean; 42 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 99 KiB/s rd, 24 KiB/s wr, 123 op/s
Dec 01 09:37:48 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e82 do_prune osdmap full prune enabled
Dec 01 09:37:48 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e83 e83: 3 total, 3 up, 3 in
Dec 01 09:37:48 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e83: 3 total, 3 up, 3 in
Dec 01 09:37:48 compute-0 ceph-mon[75031]: osdmap e82: 3 total, 3 up, 3 in
Dec 01 09:37:49 compute-0 ceph-mon[75031]: pgmap v810: 193 pgs: 193 active+clean; 42 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 99 KiB/s rd, 24 KiB/s wr, 123 op/s
Dec 01 09:37:49 compute-0 ceph-mon[75031]: osdmap e83: 3 total, 3 up, 3 in
Dec 01 09:37:49 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v812: 193 pgs: 193 active+clean; 42 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 65 KiB/s rd, 3.8 KiB/s wr, 83 op/s
Dec 01 09:37:49 compute-0 nova_compute[250706]: 2025-12-01 09:37:49.801 250710 INFO nova.virt.libvirt.driver [None req-e07427d1-f157-423e-9881-bc9bd03c99c7 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Snapshot image upload complete
Dec 01 09:37:49 compute-0 nova_compute[250706]: 2025-12-01 09:37:49.801 250710 INFO nova.compute.manager [None req-e07427d1-f157-423e-9881-bc9bd03c99c7 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Took 4.84 seconds to snapshot the instance on the hypervisor.
Dec 01 09:37:50 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:37:51 compute-0 ceph-mon[75031]: pgmap v812: 193 pgs: 193 active+clean; 42 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 65 KiB/s rd, 3.8 KiB/s wr, 83 op/s
Dec 01 09:37:51 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v813: 193 pgs: 193 active+clean; 42 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 82 KiB/s rd, 4.5 KiB/s wr, 106 op/s
Dec 01 09:37:52 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e83 do_prune osdmap full prune enabled
Dec 01 09:37:52 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e84 e84: 3 total, 3 up, 3 in
Dec 01 09:37:52 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e84: 3 total, 3 up, 3 in
Dec 01 09:37:52 compute-0 ceph-mon[75031]: pgmap v813: 193 pgs: 193 active+clean; 42 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 82 KiB/s rd, 4.5 KiB/s wr, 106 op/s
Dec 01 09:37:52 compute-0 ceph-mon[75031]: osdmap e84: 3 total, 3 up, 3 in
Dec 01 09:37:53 compute-0 nova_compute[250706]: 2025-12-01 09:37:53.133 250710 DEBUG oslo_concurrency.lockutils [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquiring lock "beb3fd59-b728-4e62-bc14-b171eebe8ee3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:37:53 compute-0 nova_compute[250706]: 2025-12-01 09:37:53.133 250710 DEBUG oslo_concurrency.lockutils [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "beb3fd59-b728-4e62-bc14-b171eebe8ee3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:37:53 compute-0 nova_compute[250706]: 2025-12-01 09:37:53.133 250710 DEBUG oslo_concurrency.lockutils [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquiring lock "beb3fd59-b728-4e62-bc14-b171eebe8ee3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:37:53 compute-0 nova_compute[250706]: 2025-12-01 09:37:53.134 250710 DEBUG oslo_concurrency.lockutils [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "beb3fd59-b728-4e62-bc14-b171eebe8ee3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:37:53 compute-0 nova_compute[250706]: 2025-12-01 09:37:53.134 250710 DEBUG oslo_concurrency.lockutils [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "beb3fd59-b728-4e62-bc14-b171eebe8ee3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:37:53 compute-0 nova_compute[250706]: 2025-12-01 09:37:53.135 250710 INFO nova.compute.manager [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Terminating instance
Dec 01 09:37:53 compute-0 nova_compute[250706]: 2025-12-01 09:37:53.136 250710 DEBUG oslo_concurrency.lockutils [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquiring lock "refresh_cache-beb3fd59-b728-4e62-bc14-b171eebe8ee3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 01 09:37:53 compute-0 nova_compute[250706]: 2025-12-01 09:37:53.137 250710 DEBUG oslo_concurrency.lockutils [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquired lock "refresh_cache-beb3fd59-b728-4e62-bc14-b171eebe8ee3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 01 09:37:53 compute-0 nova_compute[250706]: 2025-12-01 09:37:53.137 250710 DEBUG nova.network.neutron [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 01 09:37:53 compute-0 nova_compute[250706]: 2025-12-01 09:37:53.308 250710 DEBUG nova.network.neutron [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 01 09:37:53 compute-0 nova_compute[250706]: 2025-12-01 09:37:53.641 250710 DEBUG nova.network.neutron [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 01 09:37:53 compute-0 nova_compute[250706]: 2025-12-01 09:37:53.656 250710 DEBUG oslo_concurrency.lockutils [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Releasing lock "refresh_cache-beb3fd59-b728-4e62-bc14-b171eebe8ee3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 01 09:37:53 compute-0 nova_compute[250706]: 2025-12-01 09:37:53.657 250710 DEBUG nova.compute.manager [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 01 09:37:53 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v815: 193 pgs: 193 active+clean; 42 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 115 KiB/s rd, 5.6 KiB/s wr, 147 op/s
Dec 01 09:37:53 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Deactivated successfully.
Dec 01 09:37:53 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Consumed 1.318s CPU time.
Dec 01 09:37:53 compute-0 systemd-machined[212908]: Machine qemu-2-instance-00000002 terminated.
Dec 01 09:37:53 compute-0 podman[257546]: 2025-12-01 09:37:53.797193986 +0000 UTC m=+0.066427482 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 01 09:37:53 compute-0 nova_compute[250706]: 2025-12-01 09:37:53.883 250710 INFO nova.virt.libvirt.driver [-] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Instance destroyed successfully.
Dec 01 09:37:53 compute-0 nova_compute[250706]: 2025-12-01 09:37:53.884 250710 DEBUG nova.objects.instance [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lazy-loading 'resources' on Instance uuid beb3fd59-b728-4e62-bc14-b171eebe8ee3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 01 09:37:54 compute-0 ceph-mon[75031]: pgmap v815: 193 pgs: 193 active+clean; 42 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 115 KiB/s rd, 5.6 KiB/s wr, 147 op/s
Dec 01 09:37:55 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:37:55 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e84 do_prune osdmap full prune enabled
Dec 01 09:37:55 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e85 e85: 3 total, 3 up, 3 in
Dec 01 09:37:55 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e85: 3 total, 3 up, 3 in
Dec 01 09:37:55 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v817: 193 pgs: 193 active+clean; 42 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 88 KiB/s rd, 4.1 KiB/s wr, 114 op/s
Dec 01 09:37:56 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e85 do_prune osdmap full prune enabled
Dec 01 09:37:56 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e86 e86: 3 total, 3 up, 3 in
Dec 01 09:37:56 compute-0 ceph-mon[75031]: osdmap e85: 3 total, 3 up, 3 in
Dec 01 09:37:56 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e86: 3 total, 3 up, 3 in
Dec 01 09:37:56 compute-0 nova_compute[250706]: 2025-12-01 09:37:56.584 250710 INFO nova.virt.libvirt.driver [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Deleting instance files /var/lib/nova/instances/beb3fd59-b728-4e62-bc14-b171eebe8ee3_del
Dec 01 09:37:56 compute-0 nova_compute[250706]: 2025-12-01 09:37:56.584 250710 INFO nova.virt.libvirt.driver [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Deletion of /var/lib/nova/instances/beb3fd59-b728-4e62-bc14-b171eebe8ee3_del complete
Dec 01 09:37:56 compute-0 nova_compute[250706]: 2025-12-01 09:37:56.742 250710 DEBUG nova.virt.libvirt.host [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Dec 01 09:37:56 compute-0 nova_compute[250706]: 2025-12-01 09:37:56.743 250710 INFO nova.virt.libvirt.host [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] UEFI support detected
Dec 01 09:37:56 compute-0 nova_compute[250706]: 2025-12-01 09:37:56.745 250710 INFO nova.compute.manager [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Took 3.09 seconds to destroy the instance on the hypervisor.
Dec 01 09:37:56 compute-0 nova_compute[250706]: 2025-12-01 09:37:56.746 250710 DEBUG oslo.service.loopingcall [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 01 09:37:56 compute-0 nova_compute[250706]: 2025-12-01 09:37:56.746 250710 DEBUG nova.compute.manager [-] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 01 09:37:56 compute-0 nova_compute[250706]: 2025-12-01 09:37:56.746 250710 DEBUG nova.network.neutron [-] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 01 09:37:57 compute-0 nova_compute[250706]: 2025-12-01 09:37:57.017 250710 DEBUG nova.network.neutron [-] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 01 09:37:57 compute-0 nova_compute[250706]: 2025-12-01 09:37:57.032 250710 DEBUG nova.network.neutron [-] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 01 09:37:57 compute-0 nova_compute[250706]: 2025-12-01 09:37:57.044 250710 INFO nova.compute.manager [-] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Took 0.30 seconds to deallocate network for instance.
Dec 01 09:37:57 compute-0 nova_compute[250706]: 2025-12-01 09:37:57.092 250710 DEBUG oslo_concurrency.lockutils [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:37:57 compute-0 nova_compute[250706]: 2025-12-01 09:37:57.093 250710 DEBUG oslo_concurrency.lockutils [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:37:57 compute-0 nova_compute[250706]: 2025-12-01 09:37:57.163 250710 DEBUG oslo_concurrency.processutils [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:37:57 compute-0 ceph-mon[75031]: pgmap v817: 193 pgs: 193 active+clean; 42 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 88 KiB/s rd, 4.1 KiB/s wr, 114 op/s
Dec 01 09:37:57 compute-0 ceph-mon[75031]: osdmap e86: 3 total, 3 up, 3 in
Dec 01 09:37:57 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:37:57 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2334927344' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:37:57 compute-0 nova_compute[250706]: 2025-12-01 09:37:57.648 250710 DEBUG oslo_concurrency.processutils [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:37:57 compute-0 nova_compute[250706]: 2025-12-01 09:37:57.654 250710 DEBUG nova.compute.provider_tree [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Inventory has not changed in ProviderTree for provider: 847e3dbe-0f76-4032-a374-8c965945c22f update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 09:37:57 compute-0 nova_compute[250706]: 2025-12-01 09:37:57.682 250710 DEBUG nova.scheduler.client.report [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Inventory has not changed for provider 847e3dbe-0f76-4032-a374-8c965945c22f based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 09:37:57 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v819: 193 pgs: 193 active+clean; 41 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 100 KiB/s rd, 5.5 KiB/s wr, 133 op/s
Dec 01 09:37:57 compute-0 nova_compute[250706]: 2025-12-01 09:37:57.705 250710 DEBUG oslo_concurrency.lockutils [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.612s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:37:57 compute-0 nova_compute[250706]: 2025-12-01 09:37:57.740 250710 INFO nova.scheduler.client.report [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Deleted allocations for instance beb3fd59-b728-4e62-bc14-b171eebe8ee3
Dec 01 09:37:57 compute-0 nova_compute[250706]: 2025-12-01 09:37:57.822 250710 DEBUG oslo_concurrency.lockutils [None req-6db06074-ff16-4a71-a814-b3585bc4178a 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "beb3fd59-b728-4e62-bc14-b171eebe8ee3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.688s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:37:58 compute-0 nova_compute[250706]: 2025-12-01 09:37:58.376 250710 DEBUG oslo_concurrency.lockutils [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquiring lock "6740b382-574d-4ced-a156-11a531b94114" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:37:58 compute-0 nova_compute[250706]: 2025-12-01 09:37:58.377 250710 DEBUG oslo_concurrency.lockutils [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "6740b382-574d-4ced-a156-11a531b94114" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:37:58 compute-0 nova_compute[250706]: 2025-12-01 09:37:58.377 250710 DEBUG oslo_concurrency.lockutils [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquiring lock "6740b382-574d-4ced-a156-11a531b94114-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:37:58 compute-0 nova_compute[250706]: 2025-12-01 09:37:58.377 250710 DEBUG oslo_concurrency.lockutils [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "6740b382-574d-4ced-a156-11a531b94114-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:37:58 compute-0 nova_compute[250706]: 2025-12-01 09:37:58.378 250710 DEBUG oslo_concurrency.lockutils [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "6740b382-574d-4ced-a156-11a531b94114-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:37:58 compute-0 nova_compute[250706]: 2025-12-01 09:37:58.380 250710 INFO nova.compute.manager [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Terminating instance
Dec 01 09:37:58 compute-0 nova_compute[250706]: 2025-12-01 09:37:58.381 250710 DEBUG oslo_concurrency.lockutils [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquiring lock "refresh_cache-6740b382-574d-4ced-a156-11a531b94114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 01 09:37:58 compute-0 nova_compute[250706]: 2025-12-01 09:37:58.381 250710 DEBUG oslo_concurrency.lockutils [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquired lock "refresh_cache-6740b382-574d-4ced-a156-11a531b94114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 01 09:37:58 compute-0 nova_compute[250706]: 2025-12-01 09:37:58.381 250710 DEBUG nova.network.neutron [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 01 09:37:58 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2334927344' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:37:59 compute-0 nova_compute[250706]: 2025-12-01 09:37:59.023 250710 DEBUG nova.network.neutron [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 01 09:37:59 compute-0 nova_compute[250706]: 2025-12-01 09:37:59.246 250710 DEBUG nova.network.neutron [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 01 09:37:59 compute-0 nova_compute[250706]: 2025-12-01 09:37:59.264 250710 DEBUG oslo_concurrency.lockutils [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Releasing lock "refresh_cache-6740b382-574d-4ced-a156-11a531b94114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 01 09:37:59 compute-0 nova_compute[250706]: 2025-12-01 09:37:59.265 250710 DEBUG nova.compute.manager [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 01 09:37:59 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Dec 01 09:37:59 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 1.321s CPU time.
Dec 01 09:37:59 compute-0 systemd-machined[212908]: Machine qemu-1-instance-00000001 terminated.
Dec 01 09:37:59 compute-0 ceph-mon[75031]: pgmap v819: 193 pgs: 193 active+clean; 41 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 100 KiB/s rd, 5.5 KiB/s wr, 133 op/s
Dec 01 09:37:59 compute-0 nova_compute[250706]: 2025-12-01 09:37:59.489 250710 INFO nova.virt.libvirt.driver [-] [instance: 6740b382-574d-4ced-a156-11a531b94114] Instance destroyed successfully.
Dec 01 09:37:59 compute-0 nova_compute[250706]: 2025-12-01 09:37:59.490 250710 DEBUG nova.objects.instance [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lazy-loading 'resources' on Instance uuid 6740b382-574d-4ced-a156-11a531b94114 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 01 09:37:59 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v820: 193 pgs: 193 active+clean; 41 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 83 KiB/s rd, 4.6 KiB/s wr, 110 op/s
Dec 01 09:37:59 compute-0 nova_compute[250706]: 2025-12-01 09:37:59.716 250710 INFO nova.virt.libvirt.driver [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Deleting instance files /var/lib/nova/instances/6740b382-574d-4ced-a156-11a531b94114_del
Dec 01 09:37:59 compute-0 nova_compute[250706]: 2025-12-01 09:37:59.717 250710 INFO nova.virt.libvirt.driver [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Deletion of /var/lib/nova/instances/6740b382-574d-4ced-a156-11a531b94114_del complete
Dec 01 09:38:00 compute-0 nova_compute[250706]: 2025-12-01 09:38:00.006 250710 INFO nova.compute.manager [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Took 0.74 seconds to destroy the instance on the hypervisor.
Dec 01 09:38:00 compute-0 nova_compute[250706]: 2025-12-01 09:38:00.007 250710 DEBUG oslo.service.loopingcall [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 01 09:38:00 compute-0 nova_compute[250706]: 2025-12-01 09:38:00.007 250710 DEBUG nova.compute.manager [-] [instance: 6740b382-574d-4ced-a156-11a531b94114] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 01 09:38:00 compute-0 nova_compute[250706]: 2025-12-01 09:38:00.008 250710 DEBUG nova.network.neutron [-] [instance: 6740b382-574d-4ced-a156-11a531b94114] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 01 09:38:00 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:38:00 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e86 do_prune osdmap full prune enabled
Dec 01 09:38:00 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e87 e87: 3 total, 3 up, 3 in
Dec 01 09:38:00 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e87: 3 total, 3 up, 3 in
Dec 01 09:38:01 compute-0 podman[257634]: 2025-12-01 09:38:01.002677407 +0000 UTC m=+0.100901415 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 01 09:38:01 compute-0 nova_compute[250706]: 2025-12-01 09:38:01.015 250710 DEBUG nova.network.neutron [-] [instance: 6740b382-574d-4ced-a156-11a531b94114] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 01 09:38:01 compute-0 nova_compute[250706]: 2025-12-01 09:38:01.042 250710 DEBUG nova.network.neutron [-] [instance: 6740b382-574d-4ced-a156-11a531b94114] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 01 09:38:01 compute-0 nova_compute[250706]: 2025-12-01 09:38:01.058 250710 INFO nova.compute.manager [-] [instance: 6740b382-574d-4ced-a156-11a531b94114] Took 1.05 seconds to deallocate network for instance.
Dec 01 09:38:01 compute-0 ceph-mon[75031]: pgmap v820: 193 pgs: 193 active+clean; 41 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 83 KiB/s rd, 4.6 KiB/s wr, 110 op/s
Dec 01 09:38:01 compute-0 ceph-mon[75031]: osdmap e87: 3 total, 3 up, 3 in
Dec 01 09:38:01 compute-0 nova_compute[250706]: 2025-12-01 09:38:01.347 250710 INFO nova.compute.manager [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Took 0.29 seconds to detach 1 volumes for instance.
Dec 01 09:38:01 compute-0 nova_compute[250706]: 2025-12-01 09:38:01.348 250710 DEBUG nova.compute.manager [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] [instance: 6740b382-574d-4ced-a156-11a531b94114] Deleting volume: 61c9bc29-5cbf-4816-a0ae-b24ddf88776c _cleanup_volumes /usr/lib/python3.9/site-packages/nova/compute/manager.py:3217
Dec 01 09:38:01 compute-0 nova_compute[250706]: 2025-12-01 09:38:01.517 250710 DEBUG oslo_concurrency.lockutils [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:38:01 compute-0 nova_compute[250706]: 2025-12-01 09:38:01.517 250710 DEBUG oslo_concurrency.lockutils [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:38:01 compute-0 nova_compute[250706]: 2025-12-01 09:38:01.568 250710 DEBUG oslo_concurrency.processutils [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:38:01 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v822: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail; 78 KiB/s rd, 3.4 KiB/s wr, 104 op/s
Dec 01 09:38:01 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:38:01 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1145948787' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:38:01 compute-0 nova_compute[250706]: 2025-12-01 09:38:01.996 250710 DEBUG oslo_concurrency.processutils [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:38:02 compute-0 nova_compute[250706]: 2025-12-01 09:38:02.002 250710 DEBUG nova.compute.provider_tree [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Inventory has not changed in ProviderTree for provider: 847e3dbe-0f76-4032-a374-8c965945c22f update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 09:38:02 compute-0 nova_compute[250706]: 2025-12-01 09:38:02.018 250710 DEBUG nova.scheduler.client.report [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Inventory has not changed for provider 847e3dbe-0f76-4032-a374-8c965945c22f based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 09:38:02 compute-0 nova_compute[250706]: 2025-12-01 09:38:02.037 250710 DEBUG oslo_concurrency.lockutils [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.520s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:38:02 compute-0 nova_compute[250706]: 2025-12-01 09:38:02.069 250710 INFO nova.scheduler.client.report [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Deleted allocations for instance 6740b382-574d-4ced-a156-11a531b94114
Dec 01 09:38:02 compute-0 nova_compute[250706]: 2025-12-01 09:38:02.157 250710 DEBUG oslo_concurrency.lockutils [None req-d244ac64-ba4c-49eb-85c1-3a6c28ed0c9c 14165de8e6af473c94a109257a29c50c 8a9d236048d24c39893cd69ad598bc1a - - default default] Lock "6740b382-574d-4ced-a156-11a531b94114" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.780s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:38:02 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e87 do_prune osdmap full prune enabled
Dec 01 09:38:02 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e88 e88: 3 total, 3 up, 3 in
Dec 01 09:38:02 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1145948787' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:38:02 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e88: 3 total, 3 up, 3 in
Dec 01 09:38:02 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Dec 01 09:38:02 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3430721002' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:38:02 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Dec 01 09:38:02 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3430721002' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:38:03 compute-0 ceph-mon[75031]: pgmap v822: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail; 78 KiB/s rd, 3.4 KiB/s wr, 104 op/s
Dec 01 09:38:03 compute-0 ceph-mon[75031]: osdmap e88: 3 total, 3 up, 3 in
Dec 01 09:38:03 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/3430721002' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:38:03 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/3430721002' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:38:03 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v824: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail; 66 KiB/s rd, 3.4 KiB/s wr, 88 op/s
Dec 01 09:38:05 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:38:05.214 159899 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:9e:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '66:a0:73:58:3b:fd'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 01 09:38:05 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:38:05.215 159899 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 01 09:38:05 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:38:05 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e88 do_prune osdmap full prune enabled
Dec 01 09:38:05 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e89 e89: 3 total, 3 up, 3 in
Dec 01 09:38:05 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e89: 3 total, 3 up, 3 in
Dec 01 09:38:05 compute-0 ceph-mon[75031]: pgmap v824: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail; 66 KiB/s rd, 3.4 KiB/s wr, 88 op/s
Dec 01 09:38:05 compute-0 ceph-mon[75031]: osdmap e89: 3 total, 3 up, 3 in
Dec 01 09:38:05 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v826: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail; 77 KiB/s rd, 3.0 KiB/s wr, 100 op/s
Dec 01 09:38:06 compute-0 podman[257682]: 2025-12-01 09:38:06.960122605 +0000 UTC m=+0.061630684 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Dec 01 09:38:07 compute-0 ceph-mon[75031]: pgmap v826: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail; 77 KiB/s rd, 3.0 KiB/s wr, 100 op/s
Dec 01 09:38:07 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v827: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail; 47 KiB/s rd, 1.9 KiB/s wr, 60 op/s
Dec 01 09:38:08 compute-0 nova_compute[250706]: 2025-12-01 09:38:08.881 250710 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764581873.87877, beb3fd59-b728-4e62-bc14-b171eebe8ee3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 01 09:38:08 compute-0 nova_compute[250706]: 2025-12-01 09:38:08.881 250710 INFO nova.compute.manager [-] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] VM Stopped (Lifecycle Event)
Dec 01 09:38:08 compute-0 nova_compute[250706]: 2025-12-01 09:38:08.904 250710 DEBUG nova.compute.manager [None req-91768fd3-3176-4f1a-8355-cb6b0dbe4f56 - - - - - -] [instance: beb3fd59-b728-4e62-bc14-b171eebe8ee3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 01 09:38:09 compute-0 ceph-mon[75031]: pgmap v827: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail; 47 KiB/s rd, 1.9 KiB/s wr, 60 op/s
Dec 01 09:38:09 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v828: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail; 37 KiB/s rd, 1.2 KiB/s wr, 46 op/s
Dec 01 09:38:10 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:38:10 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e89 do_prune osdmap full prune enabled
Dec 01 09:38:10 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 e90: 3 total, 3 up, 3 in
Dec 01 09:38:11 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v829: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail; 31 KiB/s rd, 1.1 KiB/s wr, 39 op/s
Dec 01 09:38:12 compute-0 ceph-mon[75031]: log_channel(cluster) log [DBG] : osdmap e90: 3 total, 3 up, 3 in
Dec 01 09:38:12 compute-0 ceph-mon[75031]: pgmap v828: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail; 37 KiB/s rd, 1.2 KiB/s wr, 46 op/s
Dec 01 09:38:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:38:13
Dec 01 09:38:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:38:13 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:38:13 compute-0 ceph-mgr[75324]: [balancer INFO root] pools ['backups', 'vms', 'cephfs.cephfs.data', 'cephfs.cephfs.meta', 'images', 'volumes', '.mgr']
Dec 01 09:38:13 compute-0 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec 01 09:38:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:38:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:38:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:38:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:38:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:38:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:38:13 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:38:13.217 159899 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a8013a17-6378-4c2f-a5de-9d3b29c7a42e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 01 09:38:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:38:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:38:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:38:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:38:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:38:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:38:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:38:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:38:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:38:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:38:13 compute-0 ceph-mon[75031]: pgmap v829: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail; 31 KiB/s rd, 1.1 KiB/s wr, 39 op/s
Dec 01 09:38:13 compute-0 ceph-mon[75031]: osdmap e90: 3 total, 3 up, 3 in
Dec 01 09:38:13 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v831: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail; 13 KiB/s rd, 0 B/s wr, 16 op/s
Dec 01 09:38:14 compute-0 nova_compute[250706]: 2025-12-01 09:38:14.487 250710 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764581879.4857285, 6740b382-574d-4ced-a156-11a531b94114 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 01 09:38:14 compute-0 nova_compute[250706]: 2025-12-01 09:38:14.487 250710 INFO nova.compute.manager [-] [instance: 6740b382-574d-4ced-a156-11a531b94114] VM Stopped (Lifecycle Event)
Dec 01 09:38:14 compute-0 nova_compute[250706]: 2025-12-01 09:38:14.511 250710 DEBUG nova.compute.manager [None req-ab855892-ca39-4092-916a-99c470d0237e - - - - - -] [instance: 6740b382-574d-4ced-a156-11a531b94114] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 01 09:38:14 compute-0 ceph-mon[75031]: pgmap v831: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail; 13 KiB/s rd, 0 B/s wr, 16 op/s
Dec 01 09:38:15 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:38:15 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v832: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:16 compute-0 ceph-mon[75031]: pgmap v832: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:17 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v833: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:38:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:38:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:38:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:38:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 6.359070782053786e-08 of space, bias 1.0, pg target 1.907721234616136e-05 quantized to 32 (current 32)
Dec 01 09:38:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:38:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 6.359070782053786e-08 of space, bias 1.0, pg target 1.907721234616136e-05 quantized to 32 (current 32)
Dec 01 09:38:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:38:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:38:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:38:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0006661126644201341 of space, bias 1.0, pg target 0.19983379932604023 quantized to 32 (current 32)
Dec 01 09:38:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:38:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec 01 09:38:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:38:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:38:18 compute-0 ceph-mon[75031]: pgmap v833: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:19 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v834: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:20 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:38:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:38:20.476 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:38:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:38:20.477 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:38:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:38:20.477 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:38:20 compute-0 ceph-mon[75031]: pgmap v834: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:21 compute-0 sudo[257702]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:38:21 compute-0 sudo[257702]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:38:21 compute-0 sudo[257702]: pam_unix(sudo:session): session closed for user root
Dec 01 09:38:21 compute-0 sudo[257727]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:38:21 compute-0 sudo[257727]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:38:21 compute-0 sudo[257727]: pam_unix(sudo:session): session closed for user root
Dec 01 09:38:21 compute-0 sudo[257752]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:38:21 compute-0 sudo[257752]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:38:21 compute-0 sudo[257752]: pam_unix(sudo:session): session closed for user root
Dec 01 09:38:21 compute-0 sudo[257777]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Dec 01 09:38:21 compute-0 sudo[257777]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:38:21 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v835: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:21 compute-0 sudo[257777]: pam_unix(sudo:session): session closed for user root
Dec 01 09:38:21 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:38:21 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:38:21 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec 01 09:38:21 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:38:21 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec 01 09:38:21 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:38:21 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 45a19509-b60b-41ae-82a2-07fa38b73aa0 does not exist
Dec 01 09:38:21 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 2153b517-35de-4c09-b96b-713702e40265 does not exist
Dec 01 09:38:21 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev b2bec11f-8d77-4b04-99b2-bf9acac9e413 does not exist
Dec 01 09:38:21 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec 01 09:38:21 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:38:22 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec 01 09:38:22 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:38:22 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:38:22 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:38:22 compute-0 sudo[257833]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:38:22 compute-0 sudo[257833]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:38:22 compute-0 sudo[257833]: pam_unix(sudo:session): session closed for user root
Dec 01 09:38:22 compute-0 sudo[257858]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:38:22 compute-0 sudo[257858]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:38:22 compute-0 sudo[257858]: pam_unix(sudo:session): session closed for user root
Dec 01 09:38:22 compute-0 sudo[257883]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:38:22 compute-0 sudo[257883]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:38:22 compute-0 sudo[257883]: pam_unix(sudo:session): session closed for user root
Dec 01 09:38:22 compute-0 sudo[257908]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --yes --no-systemd
Dec 01 09:38:22 compute-0 sudo[257908]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:38:22 compute-0 podman[257974]: 2025-12-01 09:38:22.522513418 +0000 UTC m=+0.047592541 container create 579ade348bfcece26a0bd7054862c501c0d6038f04979cdb73e5d29e3a5a673b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_tu, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, ceph=True)
Dec 01 09:38:22 compute-0 systemd[1]: Started libpod-conmon-579ade348bfcece26a0bd7054862c501c0d6038f04979cdb73e5d29e3a5a673b.scope.
Dec 01 09:38:22 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:38:22 compute-0 podman[257974]: 2025-12-01 09:38:22.501620777 +0000 UTC m=+0.026699960 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:38:22 compute-0 podman[257974]: 2025-12-01 09:38:22.613564158 +0000 UTC m=+0.138643361 container init 579ade348bfcece26a0bd7054862c501c0d6038f04979cdb73e5d29e3a5a673b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_tu, OSD_FLAVOR=default, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:38:22 compute-0 podman[257974]: 2025-12-01 09:38:22.625493971 +0000 UTC m=+0.150573094 container start 579ade348bfcece26a0bd7054862c501c0d6038f04979cdb73e5d29e3a5a673b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_tu, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS)
Dec 01 09:38:22 compute-0 podman[257974]: 2025-12-01 09:38:22.628851258 +0000 UTC m=+0.153930421 container attach 579ade348bfcece26a0bd7054862c501c0d6038f04979cdb73e5d29e3a5a673b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_tu, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:38:22 compute-0 festive_tu[257990]: 167 167
Dec 01 09:38:22 compute-0 systemd[1]: libpod-579ade348bfcece26a0bd7054862c501c0d6038f04979cdb73e5d29e3a5a673b.scope: Deactivated successfully.
Dec 01 09:38:22 compute-0 conmon[257990]: conmon 579ade348bfcece26a0b <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-579ade348bfcece26a0bd7054862c501c0d6038f04979cdb73e5d29e3a5a673b.scope/container/memory.events
Dec 01 09:38:22 compute-0 podman[257974]: 2025-12-01 09:38:22.633950395 +0000 UTC m=+0.159029548 container died 579ade348bfcece26a0bd7054862c501c0d6038f04979cdb73e5d29e3a5a673b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_tu, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:38:22 compute-0 systemd[1]: var-lib-containers-storage-overlay-875d6f89812e96f143511a8046d625095791841c219678d56f90473a1a69ff5a-merged.mount: Deactivated successfully.
Dec 01 09:38:22 compute-0 podman[257974]: 2025-12-01 09:38:22.681948366 +0000 UTC m=+0.207027519 container remove 579ade348bfcece26a0bd7054862c501c0d6038f04979cdb73e5d29e3a5a673b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=festive_tu, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:38:22 compute-0 systemd[1]: libpod-conmon-579ade348bfcece26a0bd7054862c501c0d6038f04979cdb73e5d29e3a5a673b.scope: Deactivated successfully.
Dec 01 09:38:22 compute-0 ceph-mon[75031]: pgmap v835: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:22 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:38:22 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:38:22 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:38:22 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:38:22 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:38:22 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:38:22 compute-0 podman[258013]: 2025-12-01 09:38:22.873558959 +0000 UTC m=+0.061516051 container create e2a6782e7c05ed3f1259c04ba2c53d6b9afe6527bf6333f73b3922cd4c5bcafc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_edison, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:38:22 compute-0 systemd[1]: Started libpod-conmon-e2a6782e7c05ed3f1259c04ba2c53d6b9afe6527bf6333f73b3922cd4c5bcafc.scope.
Dec 01 09:38:22 compute-0 podman[258013]: 2025-12-01 09:38:22.842243968 +0000 UTC m=+0.030201080 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:38:22 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:38:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c9957dbdf25e10d7611572038980a73c89e31aeadcf5277d253ab94582cb2be/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:38:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c9957dbdf25e10d7611572038980a73c89e31aeadcf5277d253ab94582cb2be/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:38:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c9957dbdf25e10d7611572038980a73c89e31aeadcf5277d253ab94582cb2be/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:38:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c9957dbdf25e10d7611572038980a73c89e31aeadcf5277d253ab94582cb2be/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:38:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c9957dbdf25e10d7611572038980a73c89e31aeadcf5277d253ab94582cb2be/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:38:22 compute-0 podman[258013]: 2025-12-01 09:38:22.978947542 +0000 UTC m=+0.166904634 container init e2a6782e7c05ed3f1259c04ba2c53d6b9afe6527bf6333f73b3922cd4c5bcafc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_edison, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2)
Dec 01 09:38:22 compute-0 podman[258013]: 2025-12-01 09:38:22.987531969 +0000 UTC m=+0.175489061 container start e2a6782e7c05ed3f1259c04ba2c53d6b9afe6527bf6333f73b3922cd4c5bcafc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_edison, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:38:22 compute-0 podman[258013]: 2025-12-01 09:38:22.991474572 +0000 UTC m=+0.179431674 container attach e2a6782e7c05ed3f1259c04ba2c53d6b9afe6527bf6333f73b3922cd4c5bcafc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_edison, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS)
Dec 01 09:38:23 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v836: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:23 compute-0 podman[258043]: 2025-12-01 09:38:23.974107207 +0000 UTC m=+0.068376278 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 01 09:38:24 compute-0 hopeful_edison[258030]: --> passed data devices: 0 physical, 3 LVM
Dec 01 09:38:24 compute-0 hopeful_edison[258030]: --> relative data size: 1.0
Dec 01 09:38:24 compute-0 hopeful_edison[258030]: --> All data devices are unavailable
Dec 01 09:38:24 compute-0 systemd[1]: libpod-e2a6782e7c05ed3f1259c04ba2c53d6b9afe6527bf6333f73b3922cd4c5bcafc.scope: Deactivated successfully.
Dec 01 09:38:24 compute-0 systemd[1]: libpod-e2a6782e7c05ed3f1259c04ba2c53d6b9afe6527bf6333f73b3922cd4c5bcafc.scope: Consumed 1.136s CPU time.
Dec 01 09:38:24 compute-0 podman[258013]: 2025-12-01 09:38:24.166527324 +0000 UTC m=+1.354484396 container died e2a6782e7c05ed3f1259c04ba2c53d6b9afe6527bf6333f73b3922cd4c5bcafc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_edison, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3)
Dec 01 09:38:24 compute-0 systemd[1]: var-lib-containers-storage-overlay-6c9957dbdf25e10d7611572038980a73c89e31aeadcf5277d253ab94582cb2be-merged.mount: Deactivated successfully.
Dec 01 09:38:24 compute-0 podman[258013]: 2025-12-01 09:38:24.254115575 +0000 UTC m=+1.442072637 container remove e2a6782e7c05ed3f1259c04ba2c53d6b9afe6527bf6333f73b3922cd4c5bcafc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_edison, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:38:24 compute-0 systemd[1]: libpod-conmon-e2a6782e7c05ed3f1259c04ba2c53d6b9afe6527bf6333f73b3922cd4c5bcafc.scope: Deactivated successfully.
Dec 01 09:38:24 compute-0 sudo[257908]: pam_unix(sudo:session): session closed for user root
Dec 01 09:38:24 compute-0 sudo[258095]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:38:24 compute-0 sudo[258095]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:38:24 compute-0 sudo[258095]: pam_unix(sudo:session): session closed for user root
Dec 01 09:38:24 compute-0 sudo[258120]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:38:24 compute-0 sudo[258120]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:38:24 compute-0 sudo[258120]: pam_unix(sudo:session): session closed for user root
Dec 01 09:38:24 compute-0 sudo[258145]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:38:24 compute-0 sudo[258145]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:38:24 compute-0 sudo[258145]: pam_unix(sudo:session): session closed for user root
Dec 01 09:38:24 compute-0 sudo[258170]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- lvm list --format json
Dec 01 09:38:24 compute-0 sudo[258170]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:38:24 compute-0 podman[258236]: 2025-12-01 09:38:24.837202933 +0000 UTC m=+0.036281345 container create f7f3ad2c11da596c5363355f7771bf574ab50c90c135531d4b9e3024ae2a86c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_lamarr, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:38:24 compute-0 systemd[1]: Started libpod-conmon-f7f3ad2c11da596c5363355f7771bf574ab50c90c135531d4b9e3024ae2a86c8.scope.
Dec 01 09:38:24 compute-0 ceph-mon[75031]: pgmap v836: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:24 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:38:24 compute-0 podman[258236]: 2025-12-01 09:38:24.82318246 +0000 UTC m=+0.022260892 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:38:24 compute-0 podman[258236]: 2025-12-01 09:38:24.921106418 +0000 UTC m=+0.120184870 container init f7f3ad2c11da596c5363355f7771bf574ab50c90c135531d4b9e3024ae2a86c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_lamarr, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default)
Dec 01 09:38:24 compute-0 podman[258236]: 2025-12-01 09:38:24.927152732 +0000 UTC m=+0.126231144 container start f7f3ad2c11da596c5363355f7771bf574ab50c90c135531d4b9e3024ae2a86c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_lamarr, CEPH_REF=reef, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Dec 01 09:38:24 compute-0 podman[258236]: 2025-12-01 09:38:24.930067036 +0000 UTC m=+0.129145488 container attach f7f3ad2c11da596c5363355f7771bf574ab50c90c135531d4b9e3024ae2a86c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_lamarr, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, CEPH_REF=reef)
Dec 01 09:38:24 compute-0 busy_lamarr[258252]: 167 167
Dec 01 09:38:24 compute-0 systemd[1]: libpod-f7f3ad2c11da596c5363355f7771bf574ab50c90c135531d4b9e3024ae2a86c8.scope: Deactivated successfully.
Dec 01 09:38:24 compute-0 podman[258257]: 2025-12-01 09:38:24.981872846 +0000 UTC m=+0.031547068 container died f7f3ad2c11da596c5363355f7771bf574ab50c90c135531d4b9e3024ae2a86c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_lamarr, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=reef)
Dec 01 09:38:25 compute-0 systemd[1]: var-lib-containers-storage-overlay-4f2249abd3b67e0344220ab72cd22c89450dfe486f9b67d6d1c0cdb050ba3fac-merged.mount: Deactivated successfully.
Dec 01 09:38:25 compute-0 podman[258257]: 2025-12-01 09:38:25.01882559 +0000 UTC m=+0.068499742 container remove f7f3ad2c11da596c5363355f7771bf574ab50c90c135531d4b9e3024ae2a86c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=busy_lamarr, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, ceph=True, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:38:25 compute-0 systemd[1]: libpod-conmon-f7f3ad2c11da596c5363355f7771bf574ab50c90c135531d4b9e3024ae2a86c8.scope: Deactivated successfully.
Dec 01 09:38:25 compute-0 podman[258279]: 2025-12-01 09:38:25.277488563 +0000 UTC m=+0.062001645 container create 05d6f54f7d1575068129aab4a366c3e335587191d2efb167853702a5f6db30cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=thirsty_elion, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:38:25 compute-0 systemd[1]: Started libpod-conmon-05d6f54f7d1575068129aab4a366c3e335587191d2efb167853702a5f6db30cd.scope.
Dec 01 09:38:25 compute-0 podman[258279]: 2025-12-01 09:38:25.253014499 +0000 UTC m=+0.037527591 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:38:25 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:38:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b9fdedb1ac4d0d28a148667f2301870f5158812e506266e4df35834dbf857a3d/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:38:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b9fdedb1ac4d0d28a148667f2301870f5158812e506266e4df35834dbf857a3d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:38:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b9fdedb1ac4d0d28a148667f2301870f5158812e506266e4df35834dbf857a3d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:38:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b9fdedb1ac4d0d28a148667f2301870f5158812e506266e4df35834dbf857a3d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:38:25 compute-0 podman[258279]: 2025-12-01 09:38:25.374931297 +0000 UTC m=+0.159444359 container init 05d6f54f7d1575068129aab4a366c3e335587191d2efb167853702a5f6db30cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=thirsty_elion, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:38:25 compute-0 podman[258279]: 2025-12-01 09:38:25.38927334 +0000 UTC m=+0.173786422 container start 05d6f54f7d1575068129aab4a366c3e335587191d2efb167853702a5f6db30cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=thirsty_elion, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Dec 01 09:38:25 compute-0 podman[258279]: 2025-12-01 09:38:25.394662935 +0000 UTC m=+0.179175997 container attach 05d6f54f7d1575068129aab4a366c3e335587191d2efb167853702a5f6db30cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=thirsty_elion, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef)
Dec 01 09:38:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:38:25 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v837: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:26 compute-0 thirsty_elion[258296]: {
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:     "0": [
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:         {
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             "devices": [
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "/dev/loop3"
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             ],
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             "lv_name": "ceph_lv0",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             "lv_size": "21470642176",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             "name": "ceph_lv0",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             "tags": {
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.cluster_name": "ceph",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.crush_device_class": "",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.encrypted": "0",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.osd_id": "0",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.type": "block",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.vdo": "0"
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             },
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             "type": "block",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             "vg_name": "ceph_vg0"
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:         }
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:     ],
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:     "1": [
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:         {
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             "devices": [
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "/dev/loop4"
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             ],
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             "lv_name": "ceph_lv1",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             "lv_size": "21470642176",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             "name": "ceph_lv1",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             "tags": {
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.cluster_name": "ceph",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.crush_device_class": "",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.encrypted": "0",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.osd_id": "1",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.type": "block",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.vdo": "0"
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             },
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             "type": "block",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             "vg_name": "ceph_vg1"
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:         }
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:     ],
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:     "2": [
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:         {
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             "devices": [
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "/dev/loop5"
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             ],
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             "lv_name": "ceph_lv2",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             "lv_size": "21470642176",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             "name": "ceph_lv2",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             "tags": {
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.cluster_name": "ceph",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.crush_device_class": "",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.encrypted": "0",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.osd_id": "2",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.type": "block",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:                 "ceph.vdo": "0"
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             },
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             "type": "block",
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:             "vg_name": "ceph_vg2"
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:         }
Dec 01 09:38:26 compute-0 thirsty_elion[258296]:     ]
Dec 01 09:38:26 compute-0 thirsty_elion[258296]: }
Dec 01 09:38:26 compute-0 podman[258279]: 2025-12-01 09:38:26.207269788 +0000 UTC m=+0.991782840 container died 05d6f54f7d1575068129aab4a366c3e335587191d2efb167853702a5f6db30cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=thirsty_elion, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec 01 09:38:26 compute-0 systemd[1]: libpod-05d6f54f7d1575068129aab4a366c3e335587191d2efb167853702a5f6db30cd.scope: Deactivated successfully.
Dec 01 09:38:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-b9fdedb1ac4d0d28a148667f2301870f5158812e506266e4df35834dbf857a3d-merged.mount: Deactivated successfully.
Dec 01 09:38:26 compute-0 podman[258279]: 2025-12-01 09:38:26.389160922 +0000 UTC m=+1.173673964 container remove 05d6f54f7d1575068129aab4a366c3e335587191d2efb167853702a5f6db30cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=thirsty_elion, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:38:26 compute-0 systemd[1]: libpod-conmon-05d6f54f7d1575068129aab4a366c3e335587191d2efb167853702a5f6db30cd.scope: Deactivated successfully.
Dec 01 09:38:26 compute-0 sudo[258170]: pam_unix(sudo:session): session closed for user root
Dec 01 09:38:26 compute-0 sudo[258319]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:38:26 compute-0 sudo[258319]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:38:26 compute-0 sudo[258319]: pam_unix(sudo:session): session closed for user root
Dec 01 09:38:26 compute-0 sudo[258344]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:38:26 compute-0 sudo[258344]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:38:26 compute-0 sudo[258344]: pam_unix(sudo:session): session closed for user root
Dec 01 09:38:26 compute-0 sudo[258369]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:38:26 compute-0 sudo[258369]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:38:26 compute-0 sudo[258369]: pam_unix(sudo:session): session closed for user root
Dec 01 09:38:26 compute-0 sudo[258394]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- raw list --format json
Dec 01 09:38:26 compute-0 sudo[258394]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:38:26 compute-0 ceph-mon[75031]: pgmap v837: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:27 compute-0 podman[258461]: 2025-12-01 09:38:27.021503777 +0000 UTC m=+0.042807993 container create da4ce42781b43a94db4b0aa28e96cf69b6baf0e92fae16df43ce305f510beb3e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=confident_mclean, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, CEPH_REF=reef)
Dec 01 09:38:27 compute-0 systemd[1]: Started libpod-conmon-da4ce42781b43a94db4b0aa28e96cf69b6baf0e92fae16df43ce305f510beb3e.scope.
Dec 01 09:38:27 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:38:27 compute-0 podman[258461]: 2025-12-01 09:38:27.000924114 +0000 UTC m=+0.022228330 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:38:27 compute-0 podman[258461]: 2025-12-01 09:38:27.108755617 +0000 UTC m=+0.130059803 container init da4ce42781b43a94db4b0aa28e96cf69b6baf0e92fae16df43ce305f510beb3e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=confident_mclean, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Dec 01 09:38:27 compute-0 podman[258461]: 2025-12-01 09:38:27.114010819 +0000 UTC m=+0.135315005 container start da4ce42781b43a94db4b0aa28e96cf69b6baf0e92fae16df43ce305f510beb3e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=confident_mclean, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Dec 01 09:38:27 compute-0 podman[258461]: 2025-12-01 09:38:27.117093027 +0000 UTC m=+0.138397213 container attach da4ce42781b43a94db4b0aa28e96cf69b6baf0e92fae16df43ce305f510beb3e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=confident_mclean, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:38:27 compute-0 confident_mclean[258478]: 167 167
Dec 01 09:38:27 compute-0 systemd[1]: libpod-da4ce42781b43a94db4b0aa28e96cf69b6baf0e92fae16df43ce305f510beb3e.scope: Deactivated successfully.
Dec 01 09:38:27 compute-0 podman[258461]: 2025-12-01 09:38:27.123309046 +0000 UTC m=+0.144613242 container died da4ce42781b43a94db4b0aa28e96cf69b6baf0e92fae16df43ce305f510beb3e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=confident_mclean, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Dec 01 09:38:27 compute-0 systemd[1]: var-lib-containers-storage-overlay-758cce2a7a3502b74cc8441f35cf0713ee36b985abf2f369098f07c3aee90dbf-merged.mount: Deactivated successfully.
Dec 01 09:38:27 compute-0 podman[258461]: 2025-12-01 09:38:27.162125523 +0000 UTC m=+0.183429729 container remove da4ce42781b43a94db4b0aa28e96cf69b6baf0e92fae16df43ce305f510beb3e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=confident_mclean, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2)
Dec 01 09:38:27 compute-0 systemd[1]: libpod-conmon-da4ce42781b43a94db4b0aa28e96cf69b6baf0e92fae16df43ce305f510beb3e.scope: Deactivated successfully.
Dec 01 09:38:27 compute-0 podman[258500]: 2025-12-01 09:38:27.312663425 +0000 UTC m=+0.043912085 container create 0eb5e49bb718f4944b2e78e2091212e27a1f2628a550a2392a59e848170ffd71 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=fervent_khorana, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 01 09:38:27 compute-0 systemd[1]: Started libpod-conmon-0eb5e49bb718f4944b2e78e2091212e27a1f2628a550a2392a59e848170ffd71.scope.
Dec 01 09:38:27 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:38:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2980e476e6a020ddcd40acf0843a19254d23a08a466482728ececbcd9ffa3e49/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:38:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2980e476e6a020ddcd40acf0843a19254d23a08a466482728ececbcd9ffa3e49/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:38:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2980e476e6a020ddcd40acf0843a19254d23a08a466482728ececbcd9ffa3e49/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:38:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2980e476e6a020ddcd40acf0843a19254d23a08a466482728ececbcd9ffa3e49/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:38:27 compute-0 podman[258500]: 2025-12-01 09:38:27.374313739 +0000 UTC m=+0.105562419 container init 0eb5e49bb718f4944b2e78e2091212e27a1f2628a550a2392a59e848170ffd71 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=fervent_khorana, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:38:27 compute-0 podman[258500]: 2025-12-01 09:38:27.384752469 +0000 UTC m=+0.116001119 container start 0eb5e49bb718f4944b2e78e2091212e27a1f2628a550a2392a59e848170ffd71 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=fervent_khorana, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:38:27 compute-0 podman[258500]: 2025-12-01 09:38:27.291758143 +0000 UTC m=+0.023006833 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:38:27 compute-0 podman[258500]: 2025-12-01 09:38:27.387862319 +0000 UTC m=+0.119110999 container attach 0eb5e49bb718f4944b2e78e2091212e27a1f2628a550a2392a59e848170ffd71 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=fervent_khorana, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:38:27 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v838: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:28 compute-0 fervent_khorana[258517]: {
Dec 01 09:38:28 compute-0 fervent_khorana[258517]:     "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec 01 09:38:28 compute-0 fervent_khorana[258517]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:38:28 compute-0 fervent_khorana[258517]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 01 09:38:28 compute-0 fervent_khorana[258517]:         "osd_id": 0,
Dec 01 09:38:28 compute-0 fervent_khorana[258517]:         "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:38:28 compute-0 fervent_khorana[258517]:         "type": "bluestore"
Dec 01 09:38:28 compute-0 fervent_khorana[258517]:     },
Dec 01 09:38:28 compute-0 fervent_khorana[258517]:     "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec 01 09:38:28 compute-0 fervent_khorana[258517]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:38:28 compute-0 fervent_khorana[258517]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 01 09:38:28 compute-0 fervent_khorana[258517]:         "osd_id": 1,
Dec 01 09:38:28 compute-0 fervent_khorana[258517]:         "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:38:28 compute-0 fervent_khorana[258517]:         "type": "bluestore"
Dec 01 09:38:28 compute-0 fervent_khorana[258517]:     },
Dec 01 09:38:28 compute-0 fervent_khorana[258517]:     "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec 01 09:38:28 compute-0 fervent_khorana[258517]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:38:28 compute-0 fervent_khorana[258517]:         "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec 01 09:38:28 compute-0 fervent_khorana[258517]:         "osd_id": 2,
Dec 01 09:38:28 compute-0 fervent_khorana[258517]:         "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:38:28 compute-0 fervent_khorana[258517]:         "type": "bluestore"
Dec 01 09:38:28 compute-0 fervent_khorana[258517]:     }
Dec 01 09:38:28 compute-0 fervent_khorana[258517]: }
Dec 01 09:38:28 compute-0 systemd[1]: libpod-0eb5e49bb718f4944b2e78e2091212e27a1f2628a550a2392a59e848170ffd71.scope: Deactivated successfully.
Dec 01 09:38:28 compute-0 systemd[1]: libpod-0eb5e49bb718f4944b2e78e2091212e27a1f2628a550a2392a59e848170ffd71.scope: Consumed 1.007s CPU time.
Dec 01 09:38:28 compute-0 podman[258550]: 2025-12-01 09:38:28.427067672 +0000 UTC m=+0.023234579 container died 0eb5e49bb718f4944b2e78e2091212e27a1f2628a550a2392a59e848170ffd71 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=fervent_khorana, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec 01 09:38:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-2980e476e6a020ddcd40acf0843a19254d23a08a466482728ececbcd9ffa3e49-merged.mount: Deactivated successfully.
Dec 01 09:38:28 compute-0 podman[258550]: 2025-12-01 09:38:28.47668456 +0000 UTC m=+0.072851447 container remove 0eb5e49bb718f4944b2e78e2091212e27a1f2628a550a2392a59e848170ffd71 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=fervent_khorana, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3)
Dec 01 09:38:28 compute-0 systemd[1]: libpod-conmon-0eb5e49bb718f4944b2e78e2091212e27a1f2628a550a2392a59e848170ffd71.scope: Deactivated successfully.
Dec 01 09:38:28 compute-0 sudo[258394]: pam_unix(sudo:session): session closed for user root
Dec 01 09:38:28 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:38:28 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:38:28 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:38:28 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:38:28 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 3af4ca4f-049b-44c7-be1c-87cc86e7c9e4 does not exist
Dec 01 09:38:28 compute-0 sudo[258565]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:38:28 compute-0 sudo[258565]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:38:28 compute-0 sudo[258565]: pam_unix(sudo:session): session closed for user root
Dec 01 09:38:28 compute-0 sudo[258590]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:38:28 compute-0 sudo[258590]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:38:28 compute-0 sudo[258590]: pam_unix(sudo:session): session closed for user root
Dec 01 09:38:28 compute-0 ceph-mon[75031]: pgmap v838: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:28 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:38:28 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:38:29 compute-0 nova_compute[250706]: 2025-12-01 09:38:29.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:38:29 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v839: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:30 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:38:30 compute-0 ceph-mon[75031]: pgmap v839: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:31 compute-0 nova_compute[250706]: 2025-12-01 09:38:31.048 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:38:31 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v840: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:31 compute-0 podman[258616]: 2025-12-01 09:38:31.989970325 +0000 UTC m=+0.094257713 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 01 09:38:32 compute-0 nova_compute[250706]: 2025-12-01 09:38:32.054 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:38:32 compute-0 nova_compute[250706]: 2025-12-01 09:38:32.055 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 09:38:32 compute-0 nova_compute[250706]: 2025-12-01 09:38:32.055 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 09:38:32 compute-0 nova_compute[250706]: 2025-12-01 09:38:32.103 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 09:38:32 compute-0 nova_compute[250706]: 2025-12-01 09:38:32.104 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:38:32 compute-0 ceph-mon[75031]: pgmap v840: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:33 compute-0 nova_compute[250706]: 2025-12-01 09:38:33.051 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:38:33 compute-0 nova_compute[250706]: 2025-12-01 09:38:33.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:38:33 compute-0 nova_compute[250706]: 2025-12-01 09:38:33.052 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 09:38:33 compute-0 nova_compute[250706]: 2025-12-01 09:38:33.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:38:33 compute-0 nova_compute[250706]: 2025-12-01 09:38:33.083 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:38:33 compute-0 nova_compute[250706]: 2025-12-01 09:38:33.084 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:38:33 compute-0 nova_compute[250706]: 2025-12-01 09:38:33.084 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:38:33 compute-0 nova_compute[250706]: 2025-12-01 09:38:33.084 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 09:38:33 compute-0 nova_compute[250706]: 2025-12-01 09:38:33.085 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:38:33 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:38:33 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3040711938' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:38:33 compute-0 nova_compute[250706]: 2025-12-01 09:38:33.572 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:38:33 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v841: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:33 compute-0 nova_compute[250706]: 2025-12-01 09:38:33.731 250710 WARNING nova.virt.libvirt.driver [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 09:38:33 compute-0 nova_compute[250706]: 2025-12-01 09:38:33.732 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5216MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 09:38:33 compute-0 nova_compute[250706]: 2025-12-01 09:38:33.733 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:38:33 compute-0 nova_compute[250706]: 2025-12-01 09:38:33.733 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:38:33 compute-0 nova_compute[250706]: 2025-12-01 09:38:33.815 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 09:38:33 compute-0 nova_compute[250706]: 2025-12-01 09:38:33.815 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 09:38:33 compute-0 nova_compute[250706]: 2025-12-01 09:38:33.832 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:38:33 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3040711938' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:38:34 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:38:34 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3455265482' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:38:34 compute-0 nova_compute[250706]: 2025-12-01 09:38:34.247 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.414s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:38:34 compute-0 nova_compute[250706]: 2025-12-01 09:38:34.253 250710 DEBUG nova.compute.provider_tree [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed in ProviderTree for provider: 847e3dbe-0f76-4032-a374-8c965945c22f update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 09:38:34 compute-0 nova_compute[250706]: 2025-12-01 09:38:34.280 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed for provider 847e3dbe-0f76-4032-a374-8c965945c22f based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 09:38:34 compute-0 nova_compute[250706]: 2025-12-01 09:38:34.309 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 09:38:34 compute-0 nova_compute[250706]: 2025-12-01 09:38:34.310 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.577s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:38:34 compute-0 ceph-mon[75031]: pgmap v841: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:34 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3455265482' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:38:35 compute-0 nova_compute[250706]: 2025-12-01 09:38:35.306 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:38:35 compute-0 nova_compute[250706]: 2025-12-01 09:38:35.323 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:38:35 compute-0 nova_compute[250706]: 2025-12-01 09:38:35.323 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:38:35 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:38:35 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v842: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:36 compute-0 ceph-mon[75031]: pgmap v842: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:37 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v843: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:37 compute-0 podman[258688]: 2025-12-01 09:38:37.95741117 +0000 UTC m=+0.055197560 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 01 09:38:38 compute-0 ceph-mon[75031]: pgmap v843: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:39 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v844: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:38:40 compute-0 ceph-mon[75031]: pgmap v844: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:41 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v845: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:42 compute-0 ceph-mon[75031]: pgmap v845: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:38:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:38:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:38:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:38:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:38:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:38:43 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v846: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Dec 01 09:38:44 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2472710661' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:38:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Dec 01 09:38:44 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2472710661' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:38:44 compute-0 ceph-mon[75031]: pgmap v846: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:44 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/2472710661' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:38:44 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/2472710661' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:38:45 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:38:45 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #39. Immutable memtables: 0.
Dec 01 09:38:45 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:38:45.544361) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 09:38:45 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 39
Dec 01 09:38:45 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581925544416, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 1942, "num_deletes": 268, "total_data_size": 1984584, "memory_usage": 2022936, "flush_reason": "Manual Compaction"}
Dec 01 09:38:45 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #40: started
Dec 01 09:38:45 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581925554735, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 40, "file_size": 1392693, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 15622, "largest_seqno": 17563, "table_properties": {"data_size": 1385258, "index_size": 4253, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2117, "raw_key_size": 18034, "raw_average_key_size": 21, "raw_value_size": 1369345, "raw_average_value_size": 1626, "num_data_blocks": 191, "num_entries": 842, "num_filter_entries": 842, "num_deletions": 268, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764581783, "oldest_key_time": 1764581783, "file_creation_time": 1764581925, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 40, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:38:45 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 10421 microseconds, and 5091 cpu microseconds.
Dec 01 09:38:45 compute-0 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 09:38:45 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:38:45.554786) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #40: 1392693 bytes OK
Dec 01 09:38:45 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:38:45.554807) [db/memtable_list.cc:519] [default] Level-0 commit table #40 started
Dec 01 09:38:45 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:38:45.556438) [db/memtable_list.cc:722] [default] Level-0 commit table #40: memtable #1 done
Dec 01 09:38:45 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:38:45.556455) EVENT_LOG_v1 {"time_micros": 1764581925556449, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 01 09:38:45 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:38:45.556479) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 01 09:38:45 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 1976074, prev total WAL file size 1976074, number of live WAL files 2.
Dec 01 09:38:45 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000036.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:38:45 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:38:45.557357) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400353031' seq:72057594037927935, type:22 .. '6D67727374617400373532' seq:0, type:0; will stop at (end)
Dec 01 09:38:45 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 01 09:38:45 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [40(1360KB)], [38(5489KB)]
Dec 01 09:38:45 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581925557454, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [40], "files_L6": [38], "score": -1, "input_data_size": 7013851, "oldest_snapshot_seqno": -1}
Dec 01 09:38:45 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #41: 3871 keys, 5465875 bytes, temperature: kUnknown
Dec 01 09:38:45 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581925599548, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 41, "file_size": 5465875, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 5438996, "index_size": 16082, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9733, "raw_key_size": 91208, "raw_average_key_size": 23, "raw_value_size": 5368411, "raw_average_value_size": 1386, "num_data_blocks": 695, "num_entries": 3871, "num_filter_entries": 3871, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580340, "oldest_key_time": 0, "file_creation_time": 1764581925, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:38:45 compute-0 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 09:38:45 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:38:45.599812) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 5465875 bytes
Dec 01 09:38:45 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:38:45.600941) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 166.3 rd, 129.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.3, 5.4 +0.0 blob) out(5.2 +0.0 blob), read-write-amplify(9.0) write-amplify(3.9) OK, records in: 4341, records dropped: 470 output_compression: NoCompression
Dec 01 09:38:45 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:38:45.600958) EVENT_LOG_v1 {"time_micros": 1764581925600949, "job": 18, "event": "compaction_finished", "compaction_time_micros": 42173, "compaction_time_cpu_micros": 18639, "output_level": 6, "num_output_files": 1, "total_output_size": 5465875, "num_input_records": 4341, "num_output_records": 3871, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 09:38:45 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000040.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:38:45 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581925601313, "job": 18, "event": "table_file_deletion", "file_number": 40}
Dec 01 09:38:45 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:38:45 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581925602485, "job": 18, "event": "table_file_deletion", "file_number": 38}
Dec 01 09:38:45 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:38:45.557164) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:38:45 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:38:45.602638) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:38:45 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:38:45.602646) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:38:45 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:38:45.602647) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:38:45 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:38:45.602649) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:38:45 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:38:45.602651) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:38:45 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v847: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:46 compute-0 ceph-mon[75031]: pgmap v847: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:47 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v848: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:48 compute-0 ceph-mon[75031]: pgmap v848: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:49 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v849: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:50 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:38:50 compute-0 ceph-mon[75031]: pgmap v849: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:51 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v850: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:52 compute-0 ceph-mon[75031]: pgmap v850: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:53 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v851: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:54 compute-0 ceph-mon[75031]: pgmap v851: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:54 compute-0 podman[258708]: 2025-12-01 09:38:54.98557644 +0000 UTC m=+0.075009269 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.license=GPLv2)
Dec 01 09:38:55 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:38:55 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v852: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:56 compute-0 ceph-mon[75031]: pgmap v852: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:57 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v853: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:58 compute-0 ceph-mon[75031]: pgmap v853: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:38:59 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v854: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:00 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:39:00 compute-0 ceph-mon[75031]: pgmap v854: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:01 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v855: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:02 compute-0 ceph-mon[75031]: pgmap v855: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:03 compute-0 podman[258729]: 2025-12-01 09:39:03.001627874 +0000 UTC m=+0.102146140 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 01 09:39:03 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v856: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:04 compute-0 ceph-mon[75031]: pgmap v856: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:05 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:39:05 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v857: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:06 compute-0 ceph-mon[75031]: pgmap v857: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:07 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v858: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:08 compute-0 podman[258755]: 2025-12-01 09:39:08.969159342 +0000 UTC m=+0.063781057 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 01 09:39:09 compute-0 ceph-mon[75031]: pgmap v858: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:09 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v859: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:10 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:39:10 compute-0 sshd-session[258774]: Accepted publickey for zuul from 192.168.122.10 port 47554 ssh2: ECDSA SHA256:qeYLzcMt7u+aIXWvxTim9ZQrhR80x0VMZgLc05vjQmw
Dec 01 09:39:10 compute-0 systemd-logind[788]: New session 52 of user zuul.
Dec 01 09:39:10 compute-0 systemd[1]: Started Session 52 of User zuul.
Dec 01 09:39:10 compute-0 sshd-session[258774]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:39:10 compute-0 sudo[258778]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Dec 01 09:39:10 compute-0 sudo[258778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:39:11 compute-0 ceph-mon[75031]: pgmap v859: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:11 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v860: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:39:13
Dec 01 09:39:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:39:13 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:39:13 compute-0 ceph-mgr[75324]: [balancer INFO root] pools ['backups', 'cephfs.cephfs.meta', 'cephfs.cephfs.data', 'images', 'volumes', 'vms', '.mgr']
Dec 01 09:39:13 compute-0 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec 01 09:39:13 compute-0 ceph-mon[75031]: pgmap v860: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:39:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:39:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:39:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:39:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:39:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:39:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:39:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:39:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:39:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:39:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:39:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:39:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:39:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:39:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:39:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:39:13 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14702 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:13 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v861: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:14 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14704 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:14 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status"} v 0) v1
Dec 01 09:39:14 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/536856944' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec 01 09:39:15 compute-0 ceph-mon[75031]: from='client.14702 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:15 compute-0 ceph-mon[75031]: pgmap v861: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:15 compute-0 ceph-mon[75031]: from='client.14704 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:15 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/536856944' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec 01 09:39:15 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:39:15 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v862: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:17 compute-0 ceph-mon[75031]: pgmap v862: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:17 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v863: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:39:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:39:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:39:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:39:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 6.359070782053786e-08 of space, bias 1.0, pg target 1.907721234616136e-05 quantized to 32 (current 32)
Dec 01 09:39:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:39:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 6.359070782053786e-08 of space, bias 1.0, pg target 1.907721234616136e-05 quantized to 32 (current 32)
Dec 01 09:39:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:39:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:39:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:39:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0006661126644201341 of space, bias 1.0, pg target 0.19983379932604023 quantized to 32 (current 32)
Dec 01 09:39:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:39:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec 01 09:39:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:39:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:39:19 compute-0 ceph-mon[75031]: pgmap v863: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:19 compute-0 ovs-vsctl[259082]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec 01 09:39:19 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v864: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:39:20.477 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:39:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:39:20.479 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:39:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:39:20.479 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:39:20 compute-0 virtqemud[250400]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec 01 09:39:20 compute-0 virtqemud[250400]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec 01 09:39:20 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:39:20 compute-0 virtqemud[250400]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 01 09:39:21 compute-0 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: cache status {prefix=cache status} (starting...)
Dec 01 09:39:21 compute-0 ceph-mon[75031]: pgmap v864: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:21 compute-0 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: client ls {prefix=client ls} (starting...)
Dec 01 09:39:21 compute-0 lvm[259415]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 01 09:39:21 compute-0 lvm[259415]: VG ceph_vg2 finished
Dec 01 09:39:21 compute-0 lvm[259421]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 01 09:39:21 compute-0 lvm[259421]: VG ceph_vg1 finished
Dec 01 09:39:21 compute-0 lvm[259427]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 01 09:39:21 compute-0 lvm[259427]: VG ceph_vg0 finished
Dec 01 09:39:21 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14708 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:21 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v865: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:21 compute-0 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: damage ls {prefix=damage ls} (starting...)
Dec 01 09:39:22 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14710 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:22 compute-0 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: dump loads {prefix=dump loads} (starting...)
Dec 01 09:39:22 compute-0 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Dec 01 09:39:22 compute-0 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Dec 01 09:39:22 compute-0 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Dec 01 09:39:22 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "report"} v 0) v1
Dec 01 09:39:22 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1950735095' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Dec 01 09:39:22 compute-0 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Dec 01 09:39:22 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14716 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:22 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:39:22.859+0000 7fd2d6503640 -1 mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Dec 01 09:39:22 compute-0 ceph-mgr[75324]: mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Dec 01 09:39:22 compute-0 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Dec 01 09:39:22 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:39:22 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1596225968' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:39:23 compute-0 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: get subtrees {prefix=get subtrees} (starting...)
Dec 01 09:39:23 compute-0 ceph-mon[75031]: from='client.14708 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:23 compute-0 ceph-mon[75031]: pgmap v865: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:23 compute-0 ceph-mon[75031]: from='client.14710 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:23 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1950735095' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Dec 01 09:39:23 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1596225968' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:39:23 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0) v1
Dec 01 09:39:23 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2498549434' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Dec 01 09:39:23 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config log"} v 0) v1
Dec 01 09:39:23 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3136623468' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Dec 01 09:39:23 compute-0 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: ops {prefix=ops} (starting...)
Dec 01 09:39:23 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v866: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:23 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Dec 01 09:39:23 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/670763351' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 01 09:39:23 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config-key dump"} v 0) v1
Dec 01 09:39:23 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3654002977' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Dec 01 09:39:24 compute-0 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: session ls {prefix=session ls} (starting...)
Dec 01 09:39:24 compute-0 ceph-mon[75031]: from='client.14716 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:24 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2498549434' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Dec 01 09:39:24 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3136623468' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Dec 01 09:39:24 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/670763351' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 01 09:39:24 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3654002977' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Dec 01 09:39:24 compute-0 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: status {prefix=status} (starting...)
Dec 01 09:39:24 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14730 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:24 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Dec 01 09:39:24 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4087764241' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 01 09:39:24 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14734 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:24 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Dec 01 09:39:24 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1911139208' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 01 09:39:24 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Dec 01 09:39:24 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3965947463' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 01 09:39:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "features"} v 0) v1
Dec 01 09:39:25 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/404477119' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec 01 09:39:25 compute-0 ceph-mon[75031]: pgmap v866: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:25 compute-0 ceph-mon[75031]: from='client.14730 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:25 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/4087764241' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 01 09:39:25 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1911139208' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 01 09:39:25 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3965947463' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 01 09:39:25 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/404477119' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec 01 09:39:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr stat"} v 0) v1
Dec 01 09:39:25 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/122866841' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Dec 01 09:39:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0) v1
Dec 01 09:39:25 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2845613920' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Dec 01 09:39:25 compute-0 podman[259965]: 2025-12-01 09:39:25.53723352 +0000 UTC m=+0.104881639 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 01 09:39:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:39:25 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v867: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:25 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14744 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:25 compute-0 ceph-mgr[75324]: mgr.server reply reply (95) Operation not supported Module 'insights' is not enabled/loaded (required by command 'insights'): use `ceph mgr module enable insights` to enable it
Dec 01 09:39:25 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:39:25.861+0000 7fd2d6503640 -1 mgr.server reply reply (95) Operation not supported Module 'insights' is not enabled/loaded (required by command 'insights'): use `ceph mgr module enable insights` to enable it
Dec 01 09:39:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Dec 01 09:39:25 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/428750214' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 01 09:39:26 compute-0 ceph-mon[75031]: from='client.14734 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:26 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/122866841' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Dec 01 09:39:26 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2845613920' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Dec 01 09:39:26 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/428750214' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 01 09:39:26 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0) v1
Dec 01 09:39:26 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2199971143' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Dec 01 09:39:26 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14750 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:26 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14754 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:26 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0) v1
Dec 01 09:39:26 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3996741100' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Dec 01 09:39:27 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14756 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:27 compute-0 ceph-mon[75031]: pgmap v867: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:27 compute-0 ceph-mon[75031]: from='client.14744 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:27 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2199971143' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Dec 01 09:39:27 compute-0 ceph-mon[75031]: from='client.14750 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:27 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3996741100' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Dec 01 09:39:27 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Dec 01 09:39:27 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1132193302' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 01 09:39:27 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14760 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:27 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Dec 01 09:39:27 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2888214778' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 01 09:39:27 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v868: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:14:56.208311+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 55238656 unmapped: 3424256 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:14:57.208438+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 45 handle_osd_map epochs [46,46], i have 45, src has [1,46]
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 45 handle_osd_map epochs [46,46], i have 46, src has [1,46]
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.891715050s of 14.028326988s, submitted: 204
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000128 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000016 1 0.000034
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000007 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000233 1 0.000059
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000065 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000013
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000009 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000106 1 0.000040
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000093 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000017 1 0.000032
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000182 1 0.000050
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000044 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000009 1 0.000022
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000112 1 0.000042
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000083 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000010 1 0.000019
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000130 1 0.000037
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000106 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000008 1 0.000018
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000137 1 0.000033
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000036 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000012 1 0.000025
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000113 1 0.000041
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000048 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000008 1 0.000019
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000175 1 0.000044
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000030 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000026 1 0.000033
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000155 1 0.000030
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000022 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000011 1 0.000018
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000068 1 0.000037
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000036 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000008 1 0.000015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000075 1 0.000039
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000023 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000014
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000064 1 0.000041
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000046 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000007 1 0.000018
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000078 1 0.000036
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000021 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000008 1 0.000016
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000075 1 0.000032
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000018 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000005 1 0.000012
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000008 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000052 1 0.000028
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000025 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000004 1 0.000013
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000077 1 0.000047
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.927200 13 0.000080
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.936561 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.936641 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.936796 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.073251724s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.184906006s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.073219299s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184906006s@ mbc={}] exit Reset 0.000064 1 0.000080
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.073219299s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184906006s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.073219299s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184906006s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.073219299s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184906006s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.073219299s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184906006s@ mbc={}] exit Start 0.000010 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.073219299s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184906006s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.901289 7 0.000281
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.913195 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.913496 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.913534 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098457336s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210304260s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098435402s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210304260s@ mbc={}] exit Reset 0.000039 1 0.000070
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098435402s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210304260s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098435402s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210304260s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098435402s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210304260s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098435402s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210304260s@ mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098435402s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210304260s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.896487 7 0.000073
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.909281 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.909484 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.909517 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.103429794s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.215400696s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.103402138s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215400696s@ mbc={}] exit Reset 0.000041 1 0.000066
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.103402138s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215400696s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.103402138s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215400696s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.103402138s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215400696s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.103402138s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215400696s@ mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.103402138s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215400696s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.927773 13 0.000115
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.937353 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.937427 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.937494 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.19] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072329521s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.184417725s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.19] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072308540s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184417725s@ mbc={}] exit Reset 0.000032 1 0.000063
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072308540s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184417725s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072308540s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184417725s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072308540s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184417725s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072308540s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184417725s@ mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072308540s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184417725s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.927897 13 0.000074
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.937447 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.937553 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.937689 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072682381s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.184875488s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072663307s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184875488s@ mbc={}] exit Reset 0.000028 1 0.000051
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072663307s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184875488s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072663307s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184875488s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072663307s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184875488s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072663307s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184875488s@ mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072663307s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184875488s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.928054 13 0.000084
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.937620 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.937725 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.937852 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072598457s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.184898376s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072576523s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184898376s@ mbc={}] exit Reset 0.000032 1 0.000051
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072576523s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184898376s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072576523s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184898376s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072576523s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184898376s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072576523s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184898376s@ mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.072576523s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184898376s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.928159 13 0.000106
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.937739 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.937989 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.938036 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071966171s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.184402466s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071948051s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184402466s@ mbc={}] exit Reset 0.000028 1 0.000048
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071948051s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184402466s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071948051s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184402466s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071948051s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184402466s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071948051s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184402466s@ mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071948051s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184402466s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.902126 7 0.000057
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.913748 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.913867 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.913890 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097693443s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210258484s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097675323s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210258484s@ mbc={}] exit Reset 0.000028 1 0.000054
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097675323s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210258484s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097675323s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210258484s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097675323s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210258484s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097675323s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210258484s@ mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097675323s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210258484s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.928506 13 0.000065
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.938251 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.938347 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.938376 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071633339s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.184318542s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071617126s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184318542s@ mbc={}] exit Reset 0.000040 1 0.000062
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071617126s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184318542s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071617126s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184318542s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071617126s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184318542s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071617126s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184318542s@ mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.071617126s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184318542s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.902190 7 0.000240
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.913823 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.913879 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.913893 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097608566s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210380554s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097588539s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210380554s@ mbc={}] exit Reset 0.000045 1 0.000048
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097588539s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210380554s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097588539s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210380554s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097588539s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210380554s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097588539s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210380554s@ mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097588539s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210380554s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.902343 7 0.000331
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.913857 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.913959 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.913983 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097500801s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210418701s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097480774s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210418701s@ mbc={}] exit Reset 0.000030 1 0.000050
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097480774s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210418701s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097480774s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210418701s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097480774s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210418701s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097480774s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210418701s@ mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097480774s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210418701s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.006040 2 0.000070
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000015 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.005844 2 0.000062
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.005320 2 0.000075
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.005044 2 0.000043
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.004706 2 0.000035
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.004729 2 0.000030
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.004437 2 0.000041
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.005088 2 0.000033
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000008 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.004149 2 0.000055
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.004079 2 0.000025
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.003848 2 0.000026
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.003674 2 0.000034
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.003496 2 0.000025
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.004454 2 0.000034
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.003915 2 0.000050
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.19] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.19] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.931495 13 0.000100
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.941326 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.905011 7 0.000061
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.916481 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.916656 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.916725 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.931678 13 0.000097
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.941587 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.905011 7 0.000053
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.905101 7 0.000075
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.941679 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.916535 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.916624 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.916663 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.916697 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.941501 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.916744 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.941630 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.916761 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068525314s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.184318542s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094830513s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210655212s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068471909s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184318542s@ mbc={}] exit Reset 0.000091 1 0.000226
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068471909s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184318542s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094796181s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210655212s@ mbc={}] exit Reset 0.000098 1 0.000147
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068471909s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184318542s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094977379s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210678101s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068471909s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184318542s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094796181s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210655212s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094796181s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210655212s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068471909s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184318542s@ mbc={}] exit Start 0.000010 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068471909s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184318542s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094768524s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210685730s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094736099s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210678101s@ mbc={}] exit Reset 0.000259 1 0.000292
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094736099s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210678101s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094736099s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210678101s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094737053s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210685730s@ mbc={}] exit Reset 0.000193 1 0.000236
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094736099s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210678101s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094736099s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210678101s@ mbc={}] exit Start 0.000007 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094737053s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210685730s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094736099s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210678101s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094737053s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210685730s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094737053s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210685730s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094737053s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210685730s@ mbc={}] exit Start 0.000009 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094737053s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210685730s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.905373 7 0.000314
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.931759 13 0.000080
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.942760 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.916500 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.942846 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.916655 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.942873 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.916761 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.931929 13 0.000128
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.942272 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.942359 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.942388 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094796181s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210655212s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.941706 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094796181s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210655212s@ mbc={}] exit Start 0.000249 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067185402s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.183311462s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094796181s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210655212s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067164421s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183311462s@ mbc={}] exit Reset 0.000044 1 0.000066
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067219734s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.183364868s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067164421s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183311462s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067164421s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183311462s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067164421s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183311462s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067164421s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183311462s@ mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067164421s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183311462s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067193031s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183364868s@ mbc={}] exit Reset 0.000046 1 0.000480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067193031s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183364868s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067193031s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183364868s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067193031s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183364868s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067193031s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183364868s@ mbc={}] exit Start 0.000008 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094496727s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210739136s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.905435 7 0.000041
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094424248s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210739136s@ mbc={}] exit Reset 0.000226 1 0.000256
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094424248s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210739136s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094424248s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210739136s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094424248s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210739136s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094424248s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210739136s@ mbc={}] exit Start 0.000008 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094424248s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210739136s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.932575 13 0.000084
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.943519 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.943593 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.943617 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065610886s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181999207s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065589905s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181999207s@ mbc={}] exit Reset 0.000041 1 0.000062
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065589905s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181999207s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065589905s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181999207s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065589905s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181999207s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065589905s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181999207s@ mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065589905s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181999207s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.932457 13 0.000106
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.942469 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.942727 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.942780 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.916490 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.916803 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.916832 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094371796s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210922241s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.905589 7 0.000045
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.916482 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094347954s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210922241s@ mbc={}] exit Reset 0.000042 1 0.000293
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.916589 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094347954s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210922241s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094347954s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210922241s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094347954s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210922241s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094347954s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210922241s@ mbc={}] exit Start 0.000008 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066887856s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.183380127s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094347954s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210922241s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066744804s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183380127s@ mbc={}] exit Reset 0.000163 1 0.000184
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066744804s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183380127s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066744804s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183380127s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066744804s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183380127s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066744804s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183380127s@ mbc={}] exit Start 0.000007 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066744804s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183380127s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.932766 13 0.000112
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.943042 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.944461 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.944511 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066463470s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.183166504s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068760872s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.184867859s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.905785 7 0.000060
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.916663 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.916781 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.916818 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068104744s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184867859s@ mbc={}] exit Reset 0.000684 1 0.000706
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068104744s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184867859s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068104744s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184867859s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068104744s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184867859s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094186783s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210975647s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068104744s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184867859s@ mbc={}] exit Start 0.000008 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.068104744s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.184867859s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094164848s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210975647s@ mbc={}] exit Reset 0.000040 1 0.000062
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094164848s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210975647s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094164848s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210975647s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094164848s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210975647s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094164848s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210975647s@ mbc={}] exit Start 0.000007 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.094164848s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210975647s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.933300 13 0.000347
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.944706 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.944774 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.944797 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.4] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064965248s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181869507s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.4] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064947128s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181869507s@ mbc={}] exit Reset 0.000049 1 0.000052
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064947128s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181869507s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064947128s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181869507s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064947128s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181869507s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064947128s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181869507s@ mbc={}] exit Start 0.000007 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064947128s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181869507s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066201210s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183166504s@ mbc={}] exit Reset 0.000279 1 0.000299
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066201210s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183166504s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066201210s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183166504s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066201210s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183166504s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066201210s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183166504s@ mbc={}] exit Start 0.000008 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.066201210s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183166504s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.906035 7 0.000063
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.916829 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.917056 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.917080 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.933197 13 0.000086
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.943520 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093912125s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210968018s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.067193031s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.183364868s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.944857 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.945145 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065909386s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.182998657s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093876839s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210968018s@ mbc={}] exit Reset 0.000051 1 0.000072
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093876839s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210968018s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093876839s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210968018s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093876839s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210968018s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065887451s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.182998657s@ mbc={}] exit Reset 0.000037 1 0.000055
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093876839s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210968018s@ mbc={}] exit Start 0.000008 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093876839s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210968018s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065887451s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.182998657s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065887451s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.182998657s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065887451s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.182998657s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.906228 7 0.000041
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065887451s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.182998657s@ mbc={}] exit Start 0.000008 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.065887451s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.182998657s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.933440 13 0.000095
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.944582 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.944739 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.945353 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.902068 7 0.000032
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.934205 13 0.000113
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.945186 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.945369 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.945427 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064443588s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181739807s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064415932s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181739807s@ mbc={}] exit Reset 0.000069 1 0.000078
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064415932s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181739807s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064415932s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181739807s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064415932s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181739807s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064415932s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181739807s@ mbc={}] exit Start 0.000007 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064415932s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181739807s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.6] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064523697s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181938171s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.6] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064498901s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181938171s@ mbc={}] exit Reset 0.000210 1 0.000278
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064498901s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181938171s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064498901s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181938171s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064498901s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181938171s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064498901s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181938171s@ mbc={}] exit Start 0.000016 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.064498901s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181938171s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.902227 7 0.000072
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.915218 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.915428 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.916702 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.915500 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.934578 13 0.000073
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.916974 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093406677s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210983276s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.945615 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.917499 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.945746 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.945836 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093382835s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210983276s@ mbc={}] exit Reset 0.000050 1 0.001111
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093382835s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210983276s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.917531 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093382835s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210983276s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.9] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093382835s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210983276s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093382835s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210983276s@ mbc={}] exit Start 0.000007 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063798904s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181411743s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093382835s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210983276s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.9] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063775063s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181411743s@ mbc={}] exit Reset 0.000042 1 0.000070
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093309402s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.210937500s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063775063s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181411743s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063775063s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181411743s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063775063s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181411743s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063775063s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181411743s@ mbc={}] exit Start 0.000007 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063775063s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181411743s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093281746s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210937500s@ mbc={}] exit Reset 0.000050 1 0.000527
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093281746s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210937500s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.915215 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.935918 13 0.000126
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.945905 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.946068 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.946138 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093281746s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210937500s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093281746s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210937500s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093281746s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210937500s@ mbc={}] exit Start 0.000106 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.093281746s) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.210937500s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.936706 13 0.000092
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063556671s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181381226s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.946681 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.946855 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.946886 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063532829s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181381226s@ mbc={}] exit Reset 0.000112 1 0.000129
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063532829s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181381226s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063532829s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181381226s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.917584 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063532829s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181381226s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063532829s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181381226s@ mbc={}] exit Start 0.000010 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.917639 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063532829s) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181381226s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063310623s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181175232s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.902640 7 0.000041
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.915777 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097242355s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.215141296s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063269615s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181175232s@ mbc={}] exit Reset 0.000058 1 0.000107
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.915861 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063269615s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181175232s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.915895 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063269615s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181175232s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063269615s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181175232s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063269615s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181175232s@ mbc={}] exit Start 0.000008 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063269615s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181175232s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097218513s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215141296s@ mbc={}] exit Reset 0.000045 1 0.000688
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097218513s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215141296s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097218513s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215141296s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097218513s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215141296s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097218513s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215141296s@ mbc={}] exit Start 0.000008 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097218513s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215141296s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097195625s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.215148926s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097170830s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215148926s@ mbc={}] exit Reset 0.000060 1 0.000082
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097170830s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215148926s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097170830s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215148926s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097170830s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215148926s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097170830s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215148926s@ mbc={}] exit Start 0.000007 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.936189 13 0.000135
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.942584 13 0.000151
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097170830s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215148926s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.946416 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.946739 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.947109 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.902742 7 0.000068
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.946948 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.947146 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.915546 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.915715 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.915758 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097122192s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.215202332s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097093582s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215202332s@ mbc={}] exit Reset 0.000047 1 0.000099
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097093582s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215202332s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097093582s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215202332s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097093582s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215202332s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097093582s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215202332s@ mbc={}] exit Start 0.000023 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097093582s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215202332s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.902808 7 0.000039
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.915626 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.915768 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.915805 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097187996s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.215385437s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.935135 13 0.000118
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.946832 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.947337 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.947380 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097146034s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215385437s@ mbc={}] exit Reset 0.000064 1 0.000096
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097146034s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215385437s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097146034s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215385437s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097146034s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215385437s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097146034s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215385437s@ mbc={}] exit Start 0.000015 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097146034s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215385437s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.902914 7 0.000043
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.915607 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.915855 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.915895 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=41) [2] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097031593s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.215385437s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097012520s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215385437s@ mbc={}] exit Reset 0.000047 1 0.000059
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.947032 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097012520s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215385437s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097012520s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215385437s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097012520s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215385437s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097012520s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215385437s@ mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097012520s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215385437s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.057223320s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.175636292s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.057195663s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.175636292s@ mbc={}] exit Reset 0.000052 1 0.000440
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.057195663s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.175636292s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.057195663s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.175636292s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.057195663s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.175636292s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.057195663s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.175636292s@ mbc={}] exit Start 0.000018 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.057195663s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.175636292s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=39) [2] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1f] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.096585274s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 78.215202332s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063093185s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181732178s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1f] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.096489906s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215202332s@ mbc={}] exit Reset 0.001166 1 0.001257
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.096489906s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215202332s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.096489906s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215202332s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.096489906s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215202332s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.096489906s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215202332s@ mbc={}] exit Start 0.000010 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.096489906s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.215202332s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.062900543s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181732178s@ mbc={}] exit Reset 0.000589 1 0.000617
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.062900543s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181732178s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.062900543s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181732178s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.062900543s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181732178s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.007565 2 0.000024
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.062900543s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181732178s@ mbc={}] exit Start 0.000008 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.062900543s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181732178s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.063192368s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 76.181236267s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.062316895s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181236267s@ mbc={}] exit Reset 0.000900 1 0.000919
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.062316895s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181236267s@ mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.062316895s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181236267s@ mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.062316895s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181236267s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.062316895s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181236267s@ mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.062316895s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 76.181236267s@ mbc={}] enter Started/Stray
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1f] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1f] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 46 handle_osd_map epochs [46,46], i have 46, src has [1,46]
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.9] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.6] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.9] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.6] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.4] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.4] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000130 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000013 1 0.000036
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000013 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000158 1 0.000063
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000026 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000011
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000069 1 0.000036
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000056 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000012 1 0.000020
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000114 1 0.000037
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000031 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000008 1 0.000020
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000083 1 0.000056
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000038 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000011
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000082 1 0.000030
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000057 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000010 1 0.000024
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000033 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000103 1 0.000065
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000144 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000038
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000067 1 0.000045
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000051 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000011
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000051 1 0.000026
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000056 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000018
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000008 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000103 1 0.000035
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000048 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000094 1 0.000030
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000043 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000005 1 0.000015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000080 1 0.000028
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000040 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000009 1 0.000016
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000122 1 0.000049
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000048 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000010 1 0.000019
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000108 1 0.000062
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000050 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000007 1 0.000012
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000009 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000057 1 0.000039
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000035 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000007 1 0.000013
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000150 1 0.000031
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000069 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000007 1 0.000017
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000010 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000094 1 0.000039
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000025 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000008 1 0.000105
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000075 1 0.000032
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000040 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000010 1 0.000036
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000088 1 0.000041
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000031 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000012
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000063 1 0.000029
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.007334 2 0.000044
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.007772 2 0.000056
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.006756 2 0.000044
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.006317 2 0.000026
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.005922 2 0.000037
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.006610 2 0.000032
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.005573 2 0.000034
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.005349 2 0.000022
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.004919 2 0.000035
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.004597 2 0.000028
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.003814 2 0.000033
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.003272 2 0.000033
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.003008 2 0.000033
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.003657 2 0.000027
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000007 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.002087 2 0.000047
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.004284 2 0.000035
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.002499 2 0.000024
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.004733 2 0.000031
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.002034 2 0.000051
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e(unlocked)] enter Initial
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000031 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000008 1 0.000018
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000009 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000082 1 0.000040
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:27 compute-0 ceph-osd[90166]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000381 2 0.000033
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 56541184 unmapped: 2121728 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:14:58.208762+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 46 handle_osd_map epochs [46,47], i have 46, src has [1,47]
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 46 handle_osd_map epochs [47,47], i have 47, src has [1,47]
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.075068 2 0.000056
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.081146 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.075117 2 0.000068
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.15( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.081575 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.075086 2 0.000051
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.081837 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.11( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.11( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.075547 2 0.000042
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.082455 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.16( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.075663 2 0.000043
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.083633 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1c( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.075773 2 0.000057
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.083223 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.18( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.111282 2 0.000022
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.115318 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.1f( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.107981 2 0.000039
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.115635 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1c( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.112342 2 0.000023
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.115950 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.13( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.112443 2 0.000021
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.116227 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.11( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.075826 2 0.000029
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.081508 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.e( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.075818 2 0.000038
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.080882 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.8( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.075917 2 0.000027
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.081343 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.a( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.075925 2 0.000031
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.080649 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.5( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.075549 2 0.000024
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.080394 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.5( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.075644 2 0.000025
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.080082 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.075847 2 0.000039
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.079807 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.2( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.075821 2 0.000035
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.079573 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.7( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.075908 2 0.000033
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.079364 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.8( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.11( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.113150 2 0.000026
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.11( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.117095 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.11( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.11( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.13( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.113265 2 0.000030
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.13( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.117452 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.13( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.13( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.14( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.113864 2 0.000026
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.14( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.118209 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.14( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.14( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.15( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.113461 2 0.000030
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.15( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.118026 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.114150 2 0.000043
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.15( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.119035 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.114102 2 0.000031
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.118750 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.15( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.a( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.8( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.114205 2 0.000041
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.119468 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.076687 2 0.000045
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.079852 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.c( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.115195 2 0.000031
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.120072 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.e( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.073550 2 0.000035
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.074052 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.e( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.115579 2 0.000041
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.120785 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.f( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.077176 2 0.000028
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.079778 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1d( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.115798 2 0.000032
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.121333 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1a( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.115927 2 0.000034
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.121906 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1b( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.116092 2 0.000062
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.122422 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.077217 2 0.000030
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.18( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.079347 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1e( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.077516 2 0.000033
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.079728 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1a( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 handle_osd_map epochs [47,47], i have 47, src has [1,47]
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.11( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.16( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.15( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.11( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.11( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.007370 4 0.000160
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.11( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.11( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000012 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.11( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.15( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.007457 4 0.000140
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.15( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.15( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000011 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.15( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.11( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.007439 4 0.000289
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.11( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.11( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000009 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.11( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.16( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008189 4 0.000091
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.16( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.16( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.16( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.18( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1c( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1c( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.13( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.18( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018105 4 0.000098
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.18( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.18( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000012 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.18( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1c( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018194 4 0.000073
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1c( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1c( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000010 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.1f( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1c( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.13( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017876 4 0.000060
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.13( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.1f( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018159 4 0.000293
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.13( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.1f( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.13( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.1f( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.1f( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1c( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018094 4 0.000049
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1c( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1c( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1c( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.e( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.8( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.a( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.5( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.5( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.2( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.7( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.e( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018756 4 0.000059
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.e( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.11( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.e( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000003 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.e( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.8( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018734 4 0.000052
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.8( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.8( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.8( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.a( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018690 4 0.000041
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.a( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.a( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.a( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.5( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018667 4 0.000040
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.5( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.5( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.5( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.5( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018633 4 0.000038
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.5( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.5( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.5( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.2( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018556 4 0.000038
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.2( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.2( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000003 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.2( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.7( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018509 4 0.000035
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.7( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.7( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.7( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.11( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.019009 4 0.000053
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.11( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.11( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.11( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018700 4 0.000041
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.11( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.8( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.14( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.a( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.15( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.8( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.11( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018740 4 0.000056
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.11( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.11( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000009 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.11( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.8( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018873 4 0.000039
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.8( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.8( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000024 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.8( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.a( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018347 4 0.000054
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.a( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.a( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.a( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.8( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018380 4 0.000039
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.15( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018391 4 0.000080
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.8( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.15( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.8( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.8( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.15( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000009 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.15( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018267 4 0.000043
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.13( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.14( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018661 4 0.000355
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.14( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.14( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.14( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.13( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018917 4 0.000435
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.13( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.13( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.13( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.e( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.c( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.e( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1d( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.f( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1a( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1b( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.18( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1e( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1a( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.e( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018369 4 0.000056
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.e( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.c( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018497 4 0.000856
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.e( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.e( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.c( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.e( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018039 4 0.000076
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.e( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.c( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000015 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.e( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1d( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017968 4 0.000048
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.c( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.e( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1d( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1d( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000003 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1d( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.f( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018036 4 0.000092
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1a( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017902 4 0.000044
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.f( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1a( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.f( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1a( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[6.f( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1b( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017853 4 0.000039
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1a( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1b( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1b( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000003 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.1b( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.18( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017807 4 0.000064
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.18( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.18( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[4.18( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [2] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1e( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017805 4 0.000050
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1a( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017746 4 0.000051
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1e( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1e( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[3.1e( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [2] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1a( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1a( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000544 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[7.1a( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [2] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.140052 7 0.000054
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.139802 7 0.000110
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.139496 7 0.000070
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.140735 7 0.000123
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000129 1 0.000067
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.139764 7 0.000069
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000155 1 0.000032
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.139899 7 0.000056
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000292 1 0.000021
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000414 1 0.000086
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000273 1 0.000026
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000299 1 0.000036
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.140516 7 0.000919
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000077 1 0.000043
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: unregistering
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.c deep-scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.146425 7 0.000106
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.146103 7 0.000085
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.145670 7 0.000499
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.144601 7 0.000080
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000074 1 0.000035
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000176 1 0.000016
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000213 1 0.000015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000240 1 0.000014
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.145949 7 0.000067
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.145822 7 0.000148
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.146021 7 0.000126
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.146573 7 0.000080
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.145554 7 0.000077
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.145089 7 0.000068
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.145401 7 0.000061
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.145081 7 0.000093
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.144349 7 0.000091
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.144976 7 0.000124
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.144869 7 0.000099
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.144762 7 0.000080
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.151294 7 0.000069
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000243 1 0.000026
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000265 1 0.000013
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000360 1 0.000012
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.4] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000556 1 0.000013
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000610 1 0.000013
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.6] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000649 1 0.000016
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000675 1 0.000016
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.9] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001032 1 0.000018
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001162 1 0.000017
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001198 1 0.000015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001225 1 0.000012
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001259 1 0.000013
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001274 1 0.000022
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.151282 7 0.000075
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.151377 7 0.000076
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.154835 7 0.000071
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.150218 7 0.000055
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.149424 7 0.000054
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.149584 7 0.000196
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.149767 7 0.000118
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000133 1 0.000057
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000188 1 0.000017
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.151421 7 0.000299
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000219 1 0.000016
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000240 1 0.000015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.150994 7 0.000064
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.151490 7 0.000115
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.149212 7 0.000086
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.151109 7 0.000070
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.151119 7 0.000069
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.155768 7 0.000047
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.155903 7 0.000044
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.156051 7 0.000056
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.149122 7 0.000093
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.150711 7 0.000098
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000726 1 0.000014
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.149342 7 0.000078
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000835 1 0.000014
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000661 1 0.000015
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000708 1 0.000012
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001152 1 0.000011
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001359 1 0.000435
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000980 1 0.000025
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001057 1 0.000027
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.19] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001529 1 0.000350
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001246 1 0.000014
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001466 1 0.000023
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001580 1 0.000062
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001749 1 0.000240
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001527 1 0.000636
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1f] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.002555 1 0.001843
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.c deep-scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.17( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.029881 1 0.000052
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.17( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.030055 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.17( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.170150 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.13( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.034098 1 0.000057
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.13( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.034279 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.13( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.173809 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1b( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.041486 1 0.000034
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1b( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.041824 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1b( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.182592 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.11( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.048864 1 0.000028
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.11( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.049336 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.11( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.189180 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.12( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.056115 1 0.000019
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.12( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.056430 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.12( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.196228 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.15( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.063735 1 0.000038
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.15( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.064086 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.15( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.204024 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.11( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.067404 1 0.000041
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.11( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.067541 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.11( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.208131 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.16( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.068841 1 0.000065
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.16( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.068950 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.16( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.215421 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.9( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.076196 1 0.000023
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.9( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.076410 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.9( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.222547 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.d( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.083285 1 0.000022
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.d( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.083525 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.d( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.229223 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.a( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.090615 1 0.000013
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.a( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.090893 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.a( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.235530 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.3( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.098030 1 0.000025
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.3( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.098311 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.3( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.244293 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.5( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.104882 1 0.000041
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.5( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.105194 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.5( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.251065 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.4( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.112211 1 0.000022
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.4( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.112598 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.4( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.258646 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.4] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.7( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.119568 1 0.000029
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.7( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.120158 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.7( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.266769 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.6( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.127023 1 0.000021
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.6( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.127678 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.6( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.273270 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 56877056 unmapped: 1785856 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.6] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.134283 1 0.000026
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.134963 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.280081 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.9( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.141738 1 0.000025
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.9( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.142479 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.9( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.287929 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.9] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.c( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.148539 1 0.000028
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.c( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.149606 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.c( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.294725 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.f( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.156023 1 0.000031
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.f( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.157227 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.f( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.301617 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1a( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.163234 1 0.000019
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1a( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.164462 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1a( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.309482 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.19( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.170455 1 0.000025
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.19( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.171714 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.19( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.316617 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.18( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.178009 1 0.000023
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.18( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.179301 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.18( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.324087 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1d( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.185443 1 0.000021
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1d( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.186761 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1d( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.338090 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.14( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.189753 1 0.000050
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.14( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.189918 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.14( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.341246 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.16( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.197090 1 0.000035
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.16( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.197323 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.16( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.352186 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.3( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.204359 1 0.000024
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.3( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.204620 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.3( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.354882 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.b( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.211785 1 0.000028
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.b( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.212056 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.b( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.361506 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.2( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.218549 1 0.000041
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.2( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.219320 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.2( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.369043 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.5( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.225795 1 0.000063
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.5( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.226664 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.5( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.376468 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.2( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.233286 1 0.000035
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.2( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.233984 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.2( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.385005 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.f( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.240612 1 0.000382
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.f( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.241351 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.f( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.392874 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1d( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.247489 1 0.000049
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1d( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.248678 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1d( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.397935 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.15( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.254868 1 0.000036
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.15( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.256263 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.15( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.407966 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.18( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.262424 1 0.000044
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.18( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.263453 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.18( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.419245 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.7( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.269418 1 0.000038
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.7( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.270982 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.7( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.422128 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.19( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.277107 1 0.000122
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.19( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.278244 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.19( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.434202 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.19] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1e( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.284107 1 0.000034
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1e( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.285407 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.1e( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.441502 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.8( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.291503 1 0.000077
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.8( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.293041 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.8( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.443788 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1c( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.298705 1 0.000035
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1c( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.300337 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1c( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.449499 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.4( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.306096 1 0.000027
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.4( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.307889 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[5.4( empty lb MIN local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.459064 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1f( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.313598 1 0.000023
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1f( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.315171 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.1f( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.464571 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1f] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.13( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.320802 1 0.000020
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.13( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.323399 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 pg_epoch: 47 pg[2.13( empty lb MIN local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.474831 0 0.000000
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: not registered w/ OSD
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:14:59.208940+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 9 sent 7 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:15:28.427278+0000 osd.2 (osd.2) 8 : cluster [DBG] 2.c deep-scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:15:28.441378+0000 osd.2 (osd.2) 9 : cluster [DBG] 2.c deep-scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 323898 data_alloc: 218103808 data_used: 0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 56909824 unmapped: 1753088 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 9) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:15:28.427278+0000 osd.2 (osd.2) 8 : cluster [DBG] 2.c deep-scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:15:28.441378+0000 osd.2 (osd.2) 9 : cluster [DBG] 2.c deep-scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:00.209164+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.e scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.e scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 56975360 unmapped: 1687552 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:01.209342+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 11 sent 9 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:15:30.419231+0000 osd.2 (osd.2) 10 : cluster [DBG] 2.e scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:15:30.433349+0000 osd.2 (osd.2) 11 : cluster [DBG] 2.e scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d6fa8c00
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57139200 unmapped: 1523712 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 11) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:15:30.419231+0000 osd.2 (osd.2) 10 : cluster [DBG] 2.e scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:15:30.433349+0000 osd.2 (osd.2) 11 : cluster [DBG] 2.e scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:02.209566+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57171968 unmapped: 1490944 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:03.209726+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57171968 unmapped: 1490944 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:04.209873+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 343379 data_alloc: 218103808 data_used: 0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57180160 unmapped: 1482752 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:05.210456+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.10 deep-scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.10 deep-scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 56983552 unmapped: 1679360 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:06.210701+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 13 sent 11 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:15:35.418556+0000 osd.2 (osd.2) 12 : cluster [DBG] 2.10 deep-scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:15:35.432612+0000 osd.2 (osd.2) 13 : cluster [DBG] 2.10 deep-scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 56983552 unmapped: 1679360 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 13) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:15:35.418556+0000 osd.2 (osd.2) 12 : cluster [DBG] 2.10 deep-scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:15:35.432612+0000 osd.2 (osd.2) 13 : cluster [DBG] 2.10 deep-scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:07.210954+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.12 deep-scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.735406876s of 10.144852638s, submitted: 376
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.12 deep-scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 56983552 unmapped: 1679360 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:08.211144+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 15 sent 13 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:15:37.413039+0000 osd.2 (osd.2) 14 : cluster [DBG] 2.12 deep-scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:15:37.427101+0000 osd.2 (osd.2) 15 : cluster [DBG] 2.12 deep-scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.14 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.14 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 56934400 unmapped: 1728512 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 15) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:15:37.413039+0000 osd.2 (osd.2) 14 : cluster [DBG] 2.12 deep-scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:15:37.427101+0000 osd.2 (osd.2) 15 : cluster [DBG] 2.12 deep-scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:09.211342+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 17 sent 15 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:15:38.431619+0000 osd.2 (osd.2) 16 : cluster [DBG] 2.14 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:15:38.445720+0000 osd.2 (osd.2) 17 : cluster [DBG] 2.14 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 345591 data_alloc: 218103808 data_used: 0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 56934400 unmapped: 1728512 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 17) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:15:38.431619+0000 osd.2 (osd.2) 16 : cluster [DBG] 2.14 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:15:38.445720+0000 osd.2 (osd.2) 17 : cluster [DBG] 2.14 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:10.211635+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 56934400 unmapped: 1728512 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:11.211890+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 56942592 unmapped: 1720320 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:12.212040+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.1a scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.1a scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 56942592 unmapped: 1720320 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:13.212180+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 19 sent 17 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:15:42.390712+0000 osd.2 (osd.2) 18 : cluster [DBG] 2.1a scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:15:42.404798+0000 osd.2 (osd.2) 19 : cluster [DBG] 2.1a scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 56950784 unmapped: 1712128 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 19) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:15:42.390712+0000 osd.2 (osd.2) 18 : cluster [DBG] 2.1a scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:15:42.404798+0000 osd.2 (osd.2) 19 : cluster [DBG] 2.1a scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:14.212450+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.1e scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 2.1e scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 347887 data_alloc: 218103808 data_used: 0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57040896 unmapped: 1622016 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:15.212636+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 21 sent 19 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:15:44.394219+0000 osd.2 (osd.2) 20 : cluster [DBG] 2.1e scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:15:44.408278+0000 osd.2 (osd.2) 21 : cluster [DBG] 2.1e scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57065472 unmapped: 1597440 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 21) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:15:44.394219+0000 osd.2 (osd.2) 20 : cluster [DBG] 2.1e scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:15:44.408278+0000 osd.2 (osd.2) 21 : cluster [DBG] 2.1e scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:16.212871+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57065472 unmapped: 1597440 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:17.213071+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.6 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.991518974s of 10.020858765s, submitted: 8
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.6 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57073664 unmapped: 1589248 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:18.213361+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 23 sent 21 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:15:47.433915+0000 osd.2 (osd.2) 22 : cluster [DBG] 5.6 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:15:47.448022+0000 osd.2 (osd.2) 23 : cluster [DBG] 5.6 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57073664 unmapped: 1589248 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 23) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:15:47.433915+0000 osd.2 (osd.2) 22 : cluster [DBG] 5.6 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:15:47.448022+0000 osd.2 (osd.2) 23 : cluster [DBG] 5.6 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:19.213587+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.8 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.8 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 350181 data_alloc: 218103808 data_used: 0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57114624 unmapped: 1548288 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:20.213769+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 25 sent 23 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:15:49.462844+0000 osd.2 (osd.2) 24 : cluster [DBG] 5.8 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:15:49.476977+0000 osd.2 (osd.2) 25 : cluster [DBG] 5.8 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.a scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.a scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57122816 unmapped: 1540096 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 25) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:15:49.462844+0000 osd.2 (osd.2) 24 : cluster [DBG] 5.8 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:15:49.476977+0000 osd.2 (osd.2) 25 : cluster [DBG] 5.8 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:21.213971+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 27 sent 25 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:15:50.501098+0000 osd.2 (osd.2) 26 : cluster [DBG] 5.a scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:15:50.515116+0000 osd.2 (osd.2) 27 : cluster [DBG] 5.a scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.b scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.b scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57131008 unmapped: 1531904 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 27) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:15:50.501098+0000 osd.2 (osd.2) 26 : cluster [DBG] 5.a scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:15:50.515116+0000 osd.2 (osd.2) 27 : cluster [DBG] 5.a scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:22.214256+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 29 sent 27 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:15:51.486279+0000 osd.2 (osd.2) 28 : cluster [DBG] 5.b scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:15:51.500443+0000 osd.2 (osd.2) 29 : cluster [DBG] 5.b scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57139200 unmapped: 1523712 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 29) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:15:51.486279+0000 osd.2 (osd.2) 28 : cluster [DBG] 5.b scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:15:51.500443+0000 osd.2 (osd.2) 29 : cluster [DBG] 5.b scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:23.214520+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57139200 unmapped: 1523712 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:24.214739+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.d scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.d scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 353622 data_alloc: 218103808 data_used: 0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57171968 unmapped: 1490944 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:25.214931+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 31 sent 29 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:15:54.461604+0000 osd.2 (osd.2) 30 : cluster [DBG] 5.d scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:15:54.475716+0000 osd.2 (osd.2) 31 : cluster [DBG] 5.d scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57196544 unmapped: 1466368 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 31) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:15:54.461604+0000 osd.2 (osd.2) 30 : cluster [DBG] 5.d scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:15:54.475716+0000 osd.2 (osd.2) 31 : cluster [DBG] 5.d scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:26.215331+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.e scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.e scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57196544 unmapped: 1466368 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:27.215503+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 33 sent 31 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:15:56.430626+0000 osd.2 (osd.2) 32 : cluster [DBG] 5.e scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:15:56.444661+0000 osd.2 (osd.2) 33 : cluster [DBG] 5.e scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57196544 unmapped: 1466368 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 33) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:15:56.430626+0000 osd.2 (osd.2) 32 : cluster [DBG] 5.e scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:15:56.444661+0000 osd.2 (osd.2) 33 : cluster [DBG] 5.e scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:28.215742+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57204736 unmapped: 1458176 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:29.215914+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 354769 data_alloc: 218103808 data_used: 0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57204736 unmapped: 1458176 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:30.216060+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57221120 unmapped: 1441792 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:31.216213+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57221120 unmapped: 1441792 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:32.216380+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57221120 unmapped: 1441792 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:33.216547+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57229312 unmapped: 1433600 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:34.216686+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.10 deep-scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.091085434s of 17.134098053s, submitted: 12
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 355917 data_alloc: 218103808 data_used: 0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.10 deep-scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57253888 unmapped: 1409024 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:35.216852+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 35 sent 33 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:04.568279+0000 osd.2 (osd.2) 34 : cluster [DBG] 5.10 deep-scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:04.582227+0000 osd.2 (osd.2) 35 : cluster [DBG] 5.10 deep-scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57262080 unmapped: 1400832 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 35) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:04.568279+0000 osd.2 (osd.2) 34 : cluster [DBG] 5.10 deep-scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:04.582227+0000 osd.2 (osd.2) 35 : cluster [DBG] 5.10 deep-scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:36.217188+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.17 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.17 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57262080 unmapped: 1400832 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:37.217387+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 37 sent 35 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:06.542063+0000 osd.2 (osd.2) 36 : cluster [DBG] 5.17 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:06.556053+0000 osd.2 (osd.2) 37 : cluster [DBG] 5.17 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57270272 unmapped: 1392640 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 37) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:06.542063+0000 osd.2 (osd.2) 36 : cluster [DBG] 5.17 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:06.556053+0000 osd.2 (osd.2) 37 : cluster [DBG] 5.17 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:38.217630+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57278464 unmapped: 1384448 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:39.217814+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.1b scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.1b scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 358213 data_alloc: 218103808 data_used: 0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57286656 unmapped: 1376256 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:40.217993+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 39 sent 37 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:09.529927+0000 osd.2 (osd.2) 38 : cluster [DBG] 5.1b scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:09.544043+0000 osd.2 (osd.2) 39 : cluster [DBG] 5.1b scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57286656 unmapped: 1376256 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:41.218357+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 39) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:09.529927+0000 osd.2 (osd.2) 38 : cluster [DBG] 5.1b scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:09.544043+0000 osd.2 (osd.2) 39 : cluster [DBG] 5.1b scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57286656 unmapped: 1376256 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:42.218525+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57294848 unmapped: 1368064 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:43.218709+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57303040 unmapped: 1359872 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:44.218861+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 359013 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57303040 unmapped: 1359872 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:45.219029+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57311232 unmapped: 1351680 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:46.219207+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.1c scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.958663940s of 11.982688904s, submitted: 6
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.1c scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57319424 unmapped: 1343488 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:47.219373+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 41 sent 39 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:16.550889+0000 osd.2 (osd.2) 40 : cluster [DBG] 5.1c scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:16.564973+0000 osd.2 (osd.2) 41 : cluster [DBG] 5.1c scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 41) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:16.550889+0000 osd.2 (osd.2) 40 : cluster [DBG] 5.1c scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:16.564973+0000 osd.2 (osd.2) 41 : cluster [DBG] 5.1c scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.1f scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.1f scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57327616 unmapped: 1335296 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:48.219822+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 43 sent 41 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:17.588748+0000 osd.2 (osd.2) 42 : cluster [DBG] 5.1f scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:17.602848+0000 osd.2 (osd.2) 43 : cluster [DBG] 5.1f scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 43) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:17.588748+0000 osd.2 (osd.2) 42 : cluster [DBG] 5.1f scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:17.602848+0000 osd.2 (osd.2) 43 : cluster [DBG] 5.1f scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57327616 unmapped: 1335296 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:49.220014+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 361309 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57327616 unmapped: 1335296 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:50.220174+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.18 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.18 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57344000 unmapped: 1318912 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:51.220388+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 45 sent 43 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:20.603951+0000 osd.2 (osd.2) 44 : cluster [DBG] 4.18 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:20.617993+0000 osd.2 (osd.2) 45 : cluster [DBG] 4.18 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 45) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:20.603951+0000 osd.2 (osd.2) 44 : cluster [DBG] 4.18 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:20.617993+0000 osd.2 (osd.2) 45 : cluster [DBG] 4.18 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57344000 unmapped: 1318912 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:52.220657+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1b scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1b scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57352192 unmapped: 1310720 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:53.220860+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 47 sent 45 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:22.648231+0000 osd.2 (osd.2) 46 : cluster [DBG] 4.1b scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:22.662347+0000 osd.2 (osd.2) 47 : cluster [DBG] 4.1b scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 47) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:22.648231+0000 osd.2 (osd.2) 46 : cluster [DBG] 4.1b scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:22.662347+0000 osd.2 (osd.2) 47 : cluster [DBG] 4.1b scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57352192 unmapped: 1310720 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:54.221102+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 363605 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1a scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1a scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57352192 unmapped: 1310720 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:55.221321+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 49 sent 47 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:24.669734+0000 osd.2 (osd.2) 48 : cluster [DBG] 4.1a scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:24.683799+0000 osd.2 (osd.2) 49 : cluster [DBG] 4.1a scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 49) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:24.669734+0000 osd.2 (osd.2) 48 : cluster [DBG] 4.1a scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:24.683799+0000 osd.2 (osd.2) 49 : cluster [DBG] 4.1a scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57368576 unmapped: 1294336 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:56.221654+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.f scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.919653893s of 10.155382156s, submitted: 10
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.f scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57368576 unmapped: 1294336 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:57.221876+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 51 sent 49 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:26.706269+0000 osd.2 (osd.2) 50 : cluster [DBG] 6.f scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:26.723859+0000 osd.2 (osd.2) 51 : cluster [DBG] 6.f scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57376768 unmapped: 1286144 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:58.222129+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 51) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:26.706269+0000 osd.2 (osd.2) 50 : cluster [DBG] 6.f scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:26.723859+0000 osd.2 (osd.2) 51 : cluster [DBG] 6.f scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57376768 unmapped: 1286144 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:59.222273+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 365900 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57376768 unmapped: 1286144 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:00.222431+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57384960 unmapped: 1277952 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:01.222587+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57384960 unmapped: 1277952 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:02.222790+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57393152 unmapped: 1269760 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:03.222963+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57393152 unmapped: 1269760 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.e scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.e scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:04.223273+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 53 sent 51 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:33.790932+0000 osd.2 (osd.2) 52 : cluster [DBG] 4.e scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:33.805014+0000 osd.2 (osd.2) 53 : cluster [DBG] 4.e scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 53) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:33.790932+0000 osd.2 (osd.2) 52 : cluster [DBG] 4.e scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:33.805014+0000 osd.2 (osd.2) 53 : cluster [DBG] 4.e scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 367047 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57401344 unmapped: 1261568 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:05.223668+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57401344 unmapped: 1261568 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:06.223924+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57401344 unmapped: 1261568 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:07.224093+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57409536 unmapped: 1253376 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:08.224528+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57409536 unmapped: 1253376 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:09.224696+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 367047 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57417728 unmapped: 1245184 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:10.224873+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57417728 unmapped: 1245184 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:11.225036+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57425920 unmapped: 1236992 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:12.225212+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57434112 unmapped: 1228800 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:13.225349+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57434112 unmapped: 1228800 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:14.225489+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 367047 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57442304 unmapped: 1220608 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:15.225646+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57442304 unmapped: 1220608 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:16.225790+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57442304 unmapped: 1220608 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:17.225967+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.a scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 20.934640884s of 20.948490143s, submitted: 4
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.a scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57450496 unmapped: 1212416 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:18.226165+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 55 sent 53 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:47.654642+0000 osd.2 (osd.2) 54 : cluster [DBG] 4.a scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:47.668603+0000 osd.2 (osd.2) 55 : cluster [DBG] 4.a scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57458688 unmapped: 1204224 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 55) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:47.654642+0000 osd.2 (osd.2) 54 : cluster [DBG] 4.a scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:47.668603+0000 osd.2 (osd.2) 55 : cluster [DBG] 4.a scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:19.226796+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 368194 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57458688 unmapped: 1204224 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:20.226924+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57466880 unmapped: 1196032 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:21.227082+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57466880 unmapped: 1196032 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:22.227220+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.8 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.8 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57483264 unmapped: 1179648 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:23.227352+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 57 sent 55 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:52.628412+0000 osd.2 (osd.2) 56 : cluster [DBG] 6.8 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:52.642469+0000 osd.2 (osd.2) 57 : cluster [DBG] 6.8 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.14 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.14 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57491456 unmapped: 1171456 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:24.227556+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 4 last_log 59 sent 57 num 4 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:53.669273+0000 osd.2 (osd.2) 58 : cluster [DBG] 6.14 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:53.697496+0000 osd.2 (osd.2) 59 : cluster [DBG] 6.14 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 57) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:52.628412+0000 osd.2 (osd.2) 56 : cluster [DBG] 6.8 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:52.642469+0000 osd.2 (osd.2) 57 : cluster [DBG] 6.8 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 59) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:53.669273+0000 osd.2 (osd.2) 58 : cluster [DBG] 6.14 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:53.697496+0000 osd.2 (osd.2) 59 : cluster [DBG] 6.14 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.13 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.13 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 371637 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57499648 unmapped: 1163264 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:25.227748+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 61 sent 59 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:54.623675+0000 osd.2 (osd.2) 60 : cluster [DBG] 4.13 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:54.637735+0000 osd.2 (osd.2) 61 : cluster [DBG] 4.13 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 61) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:54.623675+0000 osd.2 (osd.2) 60 : cluster [DBG] 4.13 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:54.637735+0000 osd.2 (osd.2) 61 : cluster [DBG] 4.13 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.11 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.11 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57507840 unmapped: 1155072 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:26.228031+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 63 sent 61 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:55.581391+0000 osd.2 (osd.2) 62 : cluster [DBG] 6.11 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:55.595457+0000 osd.2 (osd.2) 63 : cluster [DBG] 6.11 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 63) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:55.581391+0000 osd.2 (osd.2) 62 : cluster [DBG] 6.11 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:55.595457+0000 osd.2 (osd.2) 63 : cluster [DBG] 6.11 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57516032 unmapped: 1146880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:27.228333+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57516032 unmapped: 1146880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:28.228523+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57516032 unmapped: 1146880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:29.228704+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.11 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.327646255s of 11.902298927s, submitted: 10
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.11 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 373933 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57532416 unmapped: 1130496 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:30.228871+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 65 sent 63 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:59.557143+0000 osd.2 (osd.2) 64 : cluster [DBG] 4.11 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:59.571187+0000 osd.2 (osd.2) 65 : cluster [DBG] 4.11 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 65) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:59.557143+0000 osd.2 (osd.2) 64 : cluster [DBG] 4.11 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:59.571187+0000 osd.2 (osd.2) 65 : cluster [DBG] 4.11 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57548800 unmapped: 1114112 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:31.229251+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.13 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.13 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57548800 unmapped: 1114112 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:32.229527+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 67 sent 65 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:01.574587+0000 osd.2 (osd.2) 66 : cluster [DBG] 6.13 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:01.595683+0000 osd.2 (osd.2) 67 : cluster [DBG] 6.13 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.15 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.15 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 67) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:01.574587+0000 osd.2 (osd.2) 66 : cluster [DBG] 6.13 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:01.595683+0000 osd.2 (osd.2) 67 : cluster [DBG] 6.13 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57556992 unmapped: 1105920 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:33.229782+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 69 sent 67 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:02.559596+0000 osd.2 (osd.2) 68 : cluster [DBG] 6.15 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:02.580464+0000 osd.2 (osd.2) 69 : cluster [DBG] 6.15 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.1f deep-scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.1f deep-scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 69) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:02.559596+0000 osd.2 (osd.2) 68 : cluster [DBG] 6.15 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:02.580464+0000 osd.2 (osd.2) 69 : cluster [DBG] 6.15 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57556992 unmapped: 1105920 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:34.230033+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 71 sent 69 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:03.582987+0000 osd.2 (osd.2) 70 : cluster [DBG] 6.1f deep-scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:03.604146+0000 osd.2 (osd.2) 71 : cluster [DBG] 6.1f deep-scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 377377 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 71) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:03.582987+0000 osd.2 (osd.2) 70 : cluster [DBG] 6.1f deep-scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:03.604146+0000 osd.2 (osd.2) 71 : cluster [DBG] 6.1f deep-scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57516032 unmapped: 1146880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:35.230254+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57516032 unmapped: 1146880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:36.230354+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57524224 unmapped: 1138688 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:37.230503+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1c scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1c scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57524224 unmapped: 1138688 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:38.230737+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 73 sent 71 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:07.589006+0000 osd.2 (osd.2) 72 : cluster [DBG] 4.1c scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:07.603094+0000 osd.2 (osd.2) 73 : cluster [DBG] 4.1c scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57524224 unmapped: 1138688 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 73) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:07.589006+0000 osd.2 (osd.2) 72 : cluster [DBG] 4.1c scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:07.603094+0000 osd.2 (osd.2) 73 : cluster [DBG] 4.1c scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:39.231019+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 378525 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57540608 unmapped: 1122304 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:40.231207+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57548800 unmapped: 1114112 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:41.231432+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57565184 unmapped: 1097728 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:42.231635+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57565184 unmapped: 1097728 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:43.231789+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57565184 unmapped: 1097728 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:44.231967+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.18 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.905238152s of 15.089574814s, submitted: 10
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.18 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 379673 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57589760 unmapped: 1073152 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:45.232124+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 75 sent 73 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:14.646727+0000 osd.2 (osd.2) 74 : cluster [DBG] 3.18 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:14.660813+0000 osd.2 (osd.2) 75 : cluster [DBG] 3.18 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1c scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1c scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57589760 unmapped: 1073152 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 75) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:14.646727+0000 osd.2 (osd.2) 74 : cluster [DBG] 3.18 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:14.660813+0000 osd.2 (osd.2) 75 : cluster [DBG] 3.18 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:46.232369+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 77 sent 75 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:15.630119+0000 osd.2 (osd.2) 76 : cluster [DBG] 7.1c scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:15.644206+0000 osd.2 (osd.2) 77 : cluster [DBG] 7.1c scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57589760 unmapped: 1073152 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 77) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:15.630119+0000 osd.2 (osd.2) 76 : cluster [DBG] 7.1c scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:15.644206+0000 osd.2 (osd.2) 77 : cluster [DBG] 7.1c scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:47.232569+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57597952 unmapped: 1064960 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:48.232806+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57597952 unmapped: 1064960 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:49.233013+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 380821 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57606144 unmapped: 1056768 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:50.233185+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.16 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.16 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57614336 unmapped: 1048576 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:51.233384+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 79 sent 77 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:20.630324+0000 osd.2 (osd.2) 78 : cluster [DBG] 3.16 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:20.644341+0000 osd.2 (osd.2) 79 : cluster [DBG] 3.16 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57622528 unmapped: 1040384 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 79) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:20.630324+0000 osd.2 (osd.2) 78 : cluster [DBG] 3.16 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:20.644341+0000 osd.2 (osd.2) 79 : cluster [DBG] 3.16 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:52.233620+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57622528 unmapped: 1040384 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:53.233817+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57622528 unmapped: 1040384 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:54.233987+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 381969 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57630720 unmapped: 1032192 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:55.234139+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57630720 unmapped: 1032192 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:56.234380+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57638912 unmapped: 1024000 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:57.234568+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57638912 unmapped: 1024000 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:58.234887+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.11 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.969743729s of 13.996441841s, submitted: 6
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.11 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57655296 unmapped: 1007616 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:59.235195+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 81 sent 79 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:28.643193+0000 osd.2 (osd.2) 80 : cluster [DBG] 3.11 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:28.657255+0000 osd.2 (osd.2) 81 : cluster [DBG] 3.11 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 383117 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57655296 unmapped: 1007616 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 81) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:28.643193+0000 osd.2 (osd.2) 80 : cluster [DBG] 3.11 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:28.657255+0000 osd.2 (osd.2) 81 : cluster [DBG] 3.11 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:00.235541+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57655296 unmapped: 1007616 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:01.235745+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57663488 unmapped: 999424 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:02.235892+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57671680 unmapped: 991232 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:03.236073+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.15 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.15 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57679872 unmapped: 983040 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:04.236248+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 83 sent 81 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:33.523155+0000 osd.2 (osd.2) 82 : cluster [DBG] 7.15 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:33.537185+0000 osd.2 (osd.2) 83 : cluster [DBG] 7.15 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 384265 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 83) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:33.523155+0000 osd.2 (osd.2) 82 : cluster [DBG] 7.15 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:33.537185+0000 osd.2 (osd.2) 83 : cluster [DBG] 7.15 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57688064 unmapped: 974848 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:05.236533+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57688064 unmapped: 974848 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:06.236729+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.11 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.11 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57696256 unmapped: 966656 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:07.236900+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 85 sent 83 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:36.449907+0000 osd.2 (osd.2) 84 : cluster [DBG] 7.11 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:36.464074+0000 osd.2 (osd.2) 85 : cluster [DBG] 7.11 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57696256 unmapped: 966656 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 85) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:36.449907+0000 osd.2 (osd.2) 84 : cluster [DBG] 7.11 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:36.464074+0000 osd.2 (osd.2) 85 : cluster [DBG] 7.11 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:08.237237+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.e scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.e scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57704448 unmapped: 958464 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:09.237381+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 87 sent 85 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:38.488088+0000 osd.2 (osd.2) 86 : cluster [DBG] 3.e scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:38.502194+0000 osd.2 (osd.2) 87 : cluster [DBG] 3.e scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 386560 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57704448 unmapped: 958464 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 87) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:38.488088+0000 osd.2 (osd.2) 86 : cluster [DBG] 3.e scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:38.502194+0000 osd.2 (osd.2) 87 : cluster [DBG] 3.e scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:10.237574+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57712640 unmapped: 950272 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:11.237731+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57720832 unmapped: 942080 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:12.237902+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.a scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.829962730s of 13.884933472s, submitted: 8
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.a scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57745408 unmapped: 917504 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:13.238091+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 89 sent 87 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:42.528237+0000 osd.2 (osd.2) 88 : cluster [DBG] 7.a scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:42.542392+0000 osd.2 (osd.2) 89 : cluster [DBG] 7.a scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57761792 unmapped: 901120 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 89) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:42.528237+0000 osd.2 (osd.2) 88 : cluster [DBG] 7.a scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:42.542392+0000 osd.2 (osd.2) 89 : cluster [DBG] 7.a scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:14.238344+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 387707 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57761792 unmapped: 901120 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:15.238498+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57761792 unmapped: 901120 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:16.238665+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57769984 unmapped: 892928 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:17.238817+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57769984 unmapped: 892928 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:18.239003+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57778176 unmapped: 884736 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:19.239186+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.8 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.8 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 388854 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57778176 unmapped: 884736 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:20.239367+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 91 sent 89 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:49.596230+0000 osd.2 (osd.2) 90 : cluster [DBG] 7.8 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:49.610164+0000 osd.2 (osd.2) 91 : cluster [DBG] 7.8 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 91) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:49.596230+0000 osd.2 (osd.2) 90 : cluster [DBG] 7.8 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:49.610164+0000 osd.2 (osd.2) 91 : cluster [DBG] 7.8 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57778176 unmapped: 884736 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:21.239838+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57786368 unmapped: 876544 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:22.240031+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.5 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.005264282s of 10.019852638s, submitted: 4
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.5 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57794560 unmapped: 868352 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:23.240166+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 93 sent 91 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:52.548059+0000 osd.2 (osd.2) 92 : cluster [DBG] 7.5 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:52.562164+0000 osd.2 (osd.2) 93 : cluster [DBG] 7.5 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 93) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:52.548059+0000 osd.2 (osd.2) 92 : cluster [DBG] 7.5 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:52.562164+0000 osd.2 (osd.2) 93 : cluster [DBG] 7.5 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57794560 unmapped: 868352 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:24.240337+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 390001 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57802752 unmapped: 860160 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:25.240531+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57802752 unmapped: 860160 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:26.240730+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.2 deep-scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.2 deep-scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57819136 unmapped: 843776 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:27.240954+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 95 sent 93 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:56.516094+0000 osd.2 (osd.2) 94 : cluster [DBG] 7.2 deep-scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:56.530250+0000 osd.2 (osd.2) 95 : cluster [DBG] 7.2 deep-scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 95) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:56.516094+0000 osd.2 (osd.2) 94 : cluster [DBG] 7.2 deep-scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:56.530250+0000 osd.2 (osd.2) 95 : cluster [DBG] 7.2 deep-scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57827328 unmapped: 835584 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:28.241601+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.8 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.8 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57843712 unmapped: 819200 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:29.241817+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 97 sent 95 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:58.460358+0000 osd.2 (osd.2) 96 : cluster [DBG] 3.8 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:58.474342+0000 osd.2 (osd.2) 97 : cluster [DBG] 3.8 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 97) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:58.460358+0000 osd.2 (osd.2) 96 : cluster [DBG] 3.8 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:58.474342+0000 osd.2 (osd.2) 97 : cluster [DBG] 3.8 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.c deep-scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.c deep-scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 393442 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57843712 unmapped: 819200 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:30.242180+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 99 sent 97 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:59.466738+0000 osd.2 (osd.2) 98 : cluster [DBG] 7.c deep-scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:59.480846+0000 osd.2 (osd.2) 99 : cluster [DBG] 7.c deep-scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.7 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.7 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 99) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:59.466738+0000 osd.2 (osd.2) 98 : cluster [DBG] 7.c deep-scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:59.480846+0000 osd.2 (osd.2) 99 : cluster [DBG] 7.c deep-scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57851904 unmapped: 811008 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:31.242584+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 101 sent 99 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:18:00.429138+0000 osd.2 (osd.2) 100 : cluster [DBG] 3.7 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:18:00.443057+0000 osd.2 (osd.2) 101 : cluster [DBG] 3.7 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 101) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:18:00.429138+0000 osd.2 (osd.2) 100 : cluster [DBG] 3.7 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:18:00.443057+0000 osd.2 (osd.2) 101 : cluster [DBG] 3.7 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57851904 unmapped: 811008 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:32.242834+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57851904 unmapped: 811008 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:33.242973+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57860096 unmapped: 802816 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:34.243597+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 394589 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57868288 unmapped: 794624 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:35.243764+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57868288 unmapped: 794624 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:36.244002+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57876480 unmapped: 786432 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:37.244331+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57876480 unmapped: 786432 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:38.244751+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1a scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.707541466s of 15.794960022s, submitted: 10
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1a scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57884672 unmapped: 778240 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:39.245451+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 103 sent 101 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:18:08.343347+0000 osd.2 (osd.2) 102 : cluster [DBG] 7.1a scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:18:08.357321+0000 osd.2 (osd.2) 103 : cluster [DBG] 7.1a scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 103) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:18:08.343347+0000 osd.2 (osd.2) 102 : cluster [DBG] 7.1a scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:18:08.357321+0000 osd.2 (osd.2) 103 : cluster [DBG] 7.1a scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 395737 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57884672 unmapped: 778240 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:40.245886+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57884672 unmapped: 778240 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:41.246007+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57892864 unmapped: 770048 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:42.246167+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57909248 unmapped: 753664 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:43.246310+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 105 sent 103 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:18:12.300849+0000 osd.2 (osd.2) 104 : cluster [DBG] 7.1 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:18:12.314935+0000 osd.2 (osd.2) 105 : cluster [DBG] 7.1 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.1d scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.1d scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 105) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:18:12.300849+0000 osd.2 (osd.2) 104 : cluster [DBG] 7.1 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:18:12.314935+0000 osd.2 (osd.2) 105 : cluster [DBG] 7.1 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57917440 unmapped: 745472 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:44.246625+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 107 sent 105 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:18:13.349355+0000 osd.2 (osd.2) 106 : cluster [DBG] 3.1d scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:18:13.363458+0000 osd.2 (osd.2) 107 : cluster [DBG] 3.1d scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 398032 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 107) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:18:13.349355+0000 osd.2 (osd.2) 106 : cluster [DBG] 3.1d scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:18:13.363458+0000 osd.2 (osd.2) 107 : cluster [DBG] 3.1d scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57917440 unmapped: 745472 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:45.246842+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57917440 unmapped: 745472 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:46.247004+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.5 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.5 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57925632 unmapped: 737280 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:47.247557+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 109 sent 107 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:18:16.401251+0000 osd.2 (osd.2) 108 : cluster [DBG] 3.5 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:18:16.415372+0000 osd.2 (osd.2) 109 : cluster [DBG] 3.5 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.1e scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.1e scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57925632 unmapped: 737280 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 109) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:18:16.401251+0000 osd.2 (osd.2) 108 : cluster [DBG] 3.5 scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:18:16.415372+0000 osd.2 (osd.2) 109 : cluster [DBG] 3.5 scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:48.247791+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 111 sent 109 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:18:17.371499+0000 osd.2 (osd.2) 110 : cluster [DBG] 3.1e scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:18:17.385573+0000 osd.2 (osd.2) 111 : cluster [DBG] 3.1e scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57925632 unmapped: 737280 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 111) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:18:17.371499+0000 osd.2 (osd.2) 110 : cluster [DBG] 3.1e scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:18:17.385573+0000 osd.2 (osd.2) 111 : cluster [DBG] 3.1e scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:49.248087+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 400327 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57933824 unmapped: 729088 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:50.248284+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57942016 unmapped: 720896 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:51.248430+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57950208 unmapped: 712704 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:52.248583+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.e scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.057037354s of 14.124808311s, submitted: 10
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.e scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57958400 unmapped: 704512 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:53.248757+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 113 sent 111 num 2 unsent 2 sending 2
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:18:22.467843+0000 osd.2 (osd.2) 112 : cluster [DBG] 7.e scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:18:22.481991+0000 osd.2 (osd.2) 113 : cluster [DBG] 7.e scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57958400 unmapped: 704512 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 113) v1
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:18:22.467843+0000 osd.2 (osd.2) 112 : cluster [DBG] 7.e scrub starts
Dec 01 09:39:27 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:18:22.481991+0000 osd.2 (osd.2) 113 : cluster [DBG] 7.e scrub ok
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:54.248955+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57966592 unmapped: 696320 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:55.249099+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57966592 unmapped: 696320 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:56.249253+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57982976 unmapped: 679936 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:57.249524+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57982976 unmapped: 679936 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:58.249738+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57991168 unmapped: 671744 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:59.249884+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57991168 unmapped: 671744 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:00.250050+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57991168 unmapped: 671744 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:01.250334+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57999360 unmapped: 663552 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:02.250641+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57999360 unmapped: 663552 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:03.250932+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58007552 unmapped: 655360 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:04.251092+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58007552 unmapped: 655360 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:05.251254+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58015744 unmapped: 647168 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:06.251411+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58023936 unmapped: 638976 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:07.251690+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58023936 unmapped: 638976 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:08.252098+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58032128 unmapped: 630784 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:09.252378+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:10.252606+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58032128 unmapped: 630784 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:11.252767+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58040320 unmapped: 622592 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:12.253269+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58040320 unmapped: 622592 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:13.253586+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58040320 unmapped: 622592 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:14.254156+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58048512 unmapped: 614400 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:15.254497+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58040320 unmapped: 622592 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:16.254701+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58048512 unmapped: 614400 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:17.254907+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58048512 unmapped: 614400 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:18.255595+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58048512 unmapped: 614400 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:19.255792+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58056704 unmapped: 606208 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:20.255960+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58056704 unmapped: 606208 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:21.256171+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58064896 unmapped: 598016 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:22.256363+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58073088 unmapped: 589824 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:23.256514+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58081280 unmapped: 581632 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:24.256657+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58081280 unmapped: 581632 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:25.256839+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58081280 unmapped: 581632 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:26.257012+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58089472 unmapped: 573440 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:27.257230+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58089472 unmapped: 573440 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:28.257457+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58105856 unmapped: 557056 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:29.257589+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58105856 unmapped: 557056 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:30.257853+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58105856 unmapped: 557056 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:31.258101+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58114048 unmapped: 548864 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:32.258255+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58114048 unmapped: 548864 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:33.258419+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58114048 unmapped: 548864 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:34.258570+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58122240 unmapped: 540672 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:35.258732+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58130432 unmapped: 532480 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:36.258890+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58138624 unmapped: 524288 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:37.259054+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58138624 unmapped: 524288 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:38.259275+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58138624 unmapped: 524288 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:39.259419+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58146816 unmapped: 516096 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:40.259717+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58146816 unmapped: 516096 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:41.260086+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58155008 unmapped: 507904 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:42.260235+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58163200 unmapped: 499712 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:43.260384+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58171392 unmapped: 491520 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:44.260664+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58171392 unmapped: 491520 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:45.260947+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58171392 unmapped: 491520 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:46.261169+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58179584 unmapped: 483328 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:47.261383+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58179584 unmapped: 483328 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:48.261654+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58187776 unmapped: 475136 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:49.261808+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58187776 unmapped: 475136 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:50.261992+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58187776 unmapped: 475136 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:51.262186+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58195968 unmapped: 466944 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:52.262347+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58195968 unmapped: 466944 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:53.262516+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58195968 unmapped: 466944 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:54.262753+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58204160 unmapped: 458752 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:55.262898+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58204160 unmapped: 458752 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:56.263101+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58212352 unmapped: 450560 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:57.263338+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58212352 unmapped: 450560 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:58.263638+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58220544 unmapped: 442368 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:59.263842+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58220544 unmapped: 442368 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:00.264059+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58220544 unmapped: 442368 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:01.264281+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58228736 unmapped: 434176 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:02.264631+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58228736 unmapped: 434176 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:03.264869+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58228736 unmapped: 434176 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:04.265023+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58236928 unmapped: 425984 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:05.265196+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58236928 unmapped: 425984 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:06.265445+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58245120 unmapped: 417792 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:07.265682+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58253312 unmapped: 409600 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:08.266042+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58253312 unmapped: 409600 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:09.266480+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58261504 unmapped: 401408 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:10.266674+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58261504 unmapped: 401408 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:11.266870+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58269696 unmapped: 393216 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:12.267060+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58269696 unmapped: 393216 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:13.267367+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58269696 unmapped: 393216 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:14.267508+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58277888 unmapped: 385024 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:15.267734+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58286080 unmapped: 376832 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:16.267891+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58294272 unmapped: 368640 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:17.268108+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58294272 unmapped: 368640 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:18.268332+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58294272 unmapped: 368640 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:19.268553+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58302464 unmapped: 360448 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:20.269490+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58302464 unmapped: 360448 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:21.269666+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58310656 unmapped: 352256 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:22.269860+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58310656 unmapped: 352256 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:23.270033+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58310656 unmapped: 352256 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:24.270210+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58318848 unmapped: 344064 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:25.270377+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58327040 unmapped: 335872 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:26.270539+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58335232 unmapped: 327680 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:27.271285+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58335232 unmapped: 327680 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:28.271624+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58343424 unmapped: 319488 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:29.271812+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58343424 unmapped: 319488 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:30.271986+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58359808 unmapped: 303104 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:31.272162+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58359808 unmapped: 303104 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:32.272399+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58359808 unmapped: 303104 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:33.272618+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58368000 unmapped: 294912 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:34.272875+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58368000 unmapped: 294912 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:35.273122+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58376192 unmapped: 286720 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:36.273278+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58384384 unmapped: 278528 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:37.273449+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58384384 unmapped: 278528 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:38.273632+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58392576 unmapped: 270336 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:39.273812+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58392576 unmapped: 270336 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:40.274064+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58392576 unmapped: 270336 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:41.274206+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58400768 unmapped: 262144 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:42.274470+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58400768 unmapped: 262144 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:43.274701+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58408960 unmapped: 253952 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:44.274864+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58408960 unmapped: 253952 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:45.275135+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58425344 unmapped: 237568 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:46.275349+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58433536 unmapped: 229376 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:47.275518+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58433536 unmapped: 229376 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:48.275703+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58441728 unmapped: 221184 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:49.275847+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58441728 unmapped: 221184 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:50.276009+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58441728 unmapped: 221184 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:51.276376+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58449920 unmapped: 212992 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:52.276704+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58449920 unmapped: 212992 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:53.276955+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58449920 unmapped: 212992 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:54.277172+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58458112 unmapped: 204800 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:55.277429+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58458112 unmapped: 204800 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:56.277677+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58466304 unmapped: 196608 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:57.277913+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58466304 unmapped: 196608 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:58.278171+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58466304 unmapped: 196608 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:59.278427+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58474496 unmapped: 188416 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:00.278658+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58474496 unmapped: 188416 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:01.278871+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58482688 unmapped: 180224 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:02.283786+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58482688 unmapped: 180224 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:03.283914+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58490880 unmapped: 172032 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:04.284067+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58490880 unmapped: 172032 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:05.284250+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58507264 unmapped: 155648 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:06.284514+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58515456 unmapped: 147456 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:07.284794+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58515456 unmapped: 147456 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:08.284994+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58515456 unmapped: 147456 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:09.285348+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58523648 unmapped: 139264 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:10.285634+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58523648 unmapped: 139264 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:11.285842+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58523648 unmapped: 139264 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:12.286087+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58531840 unmapped: 131072 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:13.286262+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58531840 unmapped: 131072 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:14.286442+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58540032 unmapped: 122880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:15.286609+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58507264 unmapped: 155648 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:16.286746+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58507264 unmapped: 155648 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:17.286890+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58515456 unmapped: 147456 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:18.287121+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58515456 unmapped: 147456 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:19.287348+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58523648 unmapped: 139264 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:20.287507+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58515456 unmapped: 147456 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:21.287759+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58515456 unmapped: 147456 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:22.287923+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58523648 unmapped: 139264 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:23.288121+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58523648 unmapped: 139264 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:24.288415+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58531840 unmapped: 131072 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:25.288582+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58531840 unmapped: 131072 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:26.288800+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58531840 unmapped: 131072 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:27.289031+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58540032 unmapped: 122880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:28.289215+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58540032 unmapped: 122880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:29.289423+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58540032 unmapped: 122880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:30.289619+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58548224 unmapped: 114688 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:31.289823+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58548224 unmapped: 114688 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:32.290004+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58556416 unmapped: 106496 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:33.290232+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58556416 unmapped: 106496 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:34.291016+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58564608 unmapped: 98304 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:35.291205+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58564608 unmapped: 98304 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:36.291408+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58564608 unmapped: 98304 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:37.291655+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58572800 unmapped: 90112 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:38.291908+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58572800 unmapped: 90112 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:39.292132+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58580992 unmapped: 81920 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:40.292365+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58580992 unmapped: 81920 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:41.292583+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58580992 unmapped: 81920 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:42.292796+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58589184 unmapped: 73728 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:43.292936+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58597376 unmapped: 65536 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:44.293103+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58605568 unmapped: 57344 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:45.293405+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58605568 unmapped: 57344 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:46.293539+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58605568 unmapped: 57344 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:47.293682+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58613760 unmapped: 49152 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:48.293985+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58613760 unmapped: 49152 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:49.294173+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58621952 unmapped: 40960 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:50.294354+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58621952 unmapped: 40960 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:51.294501+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58630144 unmapped: 32768 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:52.294647+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58630144 unmapped: 32768 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:53.294869+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58630144 unmapped: 32768 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:54.295024+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58638336 unmapped: 24576 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:55.295229+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58638336 unmapped: 24576 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:56.295402+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58638336 unmapped: 24576 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:57.295613+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58646528 unmapped: 16384 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:58.295840+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58646528 unmapped: 16384 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:59.295999+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58654720 unmapped: 8192 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:00.296453+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58654720 unmapped: 8192 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:01.296618+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58662912 unmapped: 0 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:02.296778+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58662912 unmapped: 0 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:03.296941+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58662912 unmapped: 0 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:04.297153+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58671104 unmapped: 1040384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:05.297324+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58671104 unmapped: 1040384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:06.297504+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58679296 unmapped: 1032192 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:07.297734+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58679296 unmapped: 1032192 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:08.297992+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58679296 unmapped: 1032192 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:09.298174+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58687488 unmapped: 1024000 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:10.298366+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58687488 unmapped: 1024000 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:11.298564+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58695680 unmapped: 1015808 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:12.298715+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58695680 unmapped: 1015808 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:13.298870+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58695680 unmapped: 1015808 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:14.299020+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58695680 unmapped: 1015808 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:15.299202+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58703872 unmapped: 1007616 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:16.299354+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58703872 unmapped: 1007616 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:17.299611+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58703872 unmapped: 1007616 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:18.299842+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58712064 unmapped: 999424 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:19.300421+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58712064 unmapped: 999424 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:20.301030+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58720256 unmapped: 991232 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:21.301347+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58720256 unmapped: 991232 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:22.301843+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58728448 unmapped: 983040 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:23.302267+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58728448 unmapped: 983040 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:24.302628+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58728448 unmapped: 983040 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:25.302990+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58744832 unmapped: 966656 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:26.303169+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58744832 unmapped: 966656 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:27.303479+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58753024 unmapped: 958464 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:28.303786+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58753024 unmapped: 958464 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:29.304083+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58753024 unmapped: 958464 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:30.305049+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58769408 unmapped: 942080 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:31.305403+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58769408 unmapped: 942080 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:32.306173+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58769408 unmapped: 942080 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:33.306367+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58777600 unmapped: 933888 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:34.306535+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58777600 unmapped: 933888 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:35.306755+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58785792 unmapped: 925696 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:36.306890+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58785792 unmapped: 925696 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:37.307045+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58785792 unmapped: 925696 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:38.307240+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58793984 unmapped: 917504 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:39.308074+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58793984 unmapped: 917504 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:40.308433+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58802176 unmapped: 909312 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:41.308648+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58802176 unmapped: 909312 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:42.308865+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58802176 unmapped: 909312 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:43.309082+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58810368 unmapped: 901120 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:44.309218+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58810368 unmapped: 901120 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:45.309347+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58810368 unmapped: 901120 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:46.309484+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58818560 unmapped: 892928 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:47.309722+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58818560 unmapped: 892928 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:48.309998+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58818560 unmapped: 892928 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:49.310204+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58826752 unmapped: 884736 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:50.310384+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58826752 unmapped: 884736 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:51.310555+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58834944 unmapped: 876544 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:52.310691+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58834944 unmapped: 876544 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:53.310822+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58843136 unmapped: 868352 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:54.310972+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58843136 unmapped: 868352 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:55.311124+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58843136 unmapped: 868352 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:56.311362+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58851328 unmapped: 860160 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:57.311584+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58851328 unmapped: 860160 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:58.311811+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58851328 unmapped: 860160 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:59.312004+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58859520 unmapped: 851968 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:00.312303+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58859520 unmapped: 851968 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:01.312471+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58859520 unmapped: 851968 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:02.312588+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58867712 unmapped: 843776 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:03.312712+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58867712 unmapped: 843776 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:04.312904+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58875904 unmapped: 835584 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:05.313050+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58875904 unmapped: 835584 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:06.313196+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58884096 unmapped: 827392 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:07.313452+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58884096 unmapped: 827392 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:08.313663+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58884096 unmapped: 827392 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:09.313984+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58892288 unmapped: 819200 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:10.314126+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58892288 unmapped: 819200 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:11.314377+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58900480 unmapped: 811008 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:12.314521+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58900480 unmapped: 811008 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:13.314714+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58900480 unmapped: 811008 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:14.314848+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58908672 unmapped: 802816 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:15.315176+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58908672 unmapped: 802816 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:16.315354+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58916864 unmapped: 794624 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:17.315528+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58916864 unmapped: 794624 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:18.315704+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58916864 unmapped: 794624 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:19.315839+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58925056 unmapped: 786432 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:20.316014+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58925056 unmapped: 786432 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:21.316259+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58925056 unmapped: 786432 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:22.316447+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58933248 unmapped: 778240 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:23.316596+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58933248 unmapped: 778240 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:24.316722+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58933248 unmapped: 778240 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:25.316857+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58941440 unmapped: 770048 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:26.316986+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58941440 unmapped: 770048 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:27.317108+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58949632 unmapped: 761856 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:28.317307+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58949632 unmapped: 761856 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:29.317451+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58957824 unmapped: 753664 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:30.317558+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58966016 unmapped: 745472 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:31.317665+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58966016 unmapped: 745472 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:32.317756+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:33.317874+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58974208 unmapped: 737280 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:34.317997+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58974208 unmapped: 737280 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:35.318151+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58982400 unmapped: 729088 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:36.318265+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58982400 unmapped: 729088 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:37.318434+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58982400 unmapped: 729088 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:38.318685+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58990592 unmapped: 720896 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:39.318852+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58990592 unmapped: 720896 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:40.319067+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58982400 unmapped: 729088 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:41.319262+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58990592 unmapped: 720896 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:42.319440+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58990592 unmapped: 720896 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:43.319567+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58990592 unmapped: 720896 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:44.319772+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58998784 unmapped: 712704 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:45.319894+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58998784 unmapped: 712704 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:46.320059+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59006976 unmapped: 704512 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:47.320263+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59006976 unmapped: 704512 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:48.320526+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59006976 unmapped: 704512 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:49.320681+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59015168 unmapped: 696320 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:50.320858+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59015168 unmapped: 696320 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:51.321021+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59023360 unmapped: 688128 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:52.321141+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59023360 unmapped: 688128 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:53.321275+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59031552 unmapped: 679936 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:54.321453+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59031552 unmapped: 679936 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:55.321628+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59031552 unmapped: 679936 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:56.321726+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59039744 unmapped: 671744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:57.321880+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59039744 unmapped: 671744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:58.322082+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59039744 unmapped: 671744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:59.322309+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59039744 unmapped: 671744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:00.322448+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59039744 unmapped: 671744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:01.322578+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59039744 unmapped: 671744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:02.322708+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59047936 unmapped: 663552 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:03.322830+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59047936 unmapped: 663552 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:04.322986+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59047936 unmapped: 663552 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:05.323151+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59056128 unmapped: 655360 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:06.323297+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59056128 unmapped: 655360 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:07.323419+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59064320 unmapped: 647168 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:08.323607+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59064320 unmapped: 647168 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:09.323753+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59064320 unmapped: 647168 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:10.323882+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59072512 unmapped: 638976 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:11.324020+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59072512 unmapped: 638976 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:12.324162+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59072512 unmapped: 638976 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:13.324299+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59080704 unmapped: 630784 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:14.324422+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59080704 unmapped: 630784 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:15.324559+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59088896 unmapped: 622592 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:16.324714+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59088896 unmapped: 622592 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:17.324947+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59088896 unmapped: 622592 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:18.325111+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59097088 unmapped: 614400 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:19.325234+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59097088 unmapped: 614400 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:20.325353+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59105280 unmapped: 606208 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:21.325503+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59105280 unmapped: 606208 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:22.325702+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59105280 unmapped: 606208 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:23.325837+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59105280 unmapped: 606208 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:24.326020+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59113472 unmapped: 598016 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:25.326204+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59113472 unmapped: 598016 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:26.326700+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59113472 unmapped: 598016 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:27.326905+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59121664 unmapped: 589824 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:28.327154+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59129856 unmapped: 581632 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:29.327377+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59129856 unmapped: 581632 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:30.327639+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59129856 unmapped: 581632 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:31.327896+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59129856 unmapped: 581632 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:32.328162+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59138048 unmapped: 573440 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:33.328347+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59138048 unmapped: 573440 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:34.328633+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59146240 unmapped: 565248 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:35.328895+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59146240 unmapped: 565248 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:36.329009+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59154432 unmapped: 557056 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:37.329199+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59154432 unmapped: 557056 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:38.329423+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59162624 unmapped: 548864 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:39.329880+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59162624 unmapped: 548864 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:40.330178+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59170816 unmapped: 540672 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:41.330392+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59179008 unmapped: 532480 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:42.330529+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59179008 unmapped: 532480 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:43.330668+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59187200 unmapped: 524288 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:44.330811+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59195392 unmapped: 516096 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:45.330996+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59195392 unmapped: 516096 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:46.331146+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59195392 unmapped: 516096 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:47.331319+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59203584 unmapped: 507904 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:48.331533+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59203584 unmapped: 507904 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:49.331751+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59211776 unmapped: 499712 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:50.331915+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59211776 unmapped: 499712 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:51.332172+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59211776 unmapped: 499712 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:52.332467+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59211776 unmapped: 499712 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:53.332778+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59219968 unmapped: 491520 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Cumulative writes: 4162 writes, 19K keys, 4162 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
                                           Cumulative WAL: 4162 writes, 352 syncs, 11.82 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 4162 writes, 19K keys, 4162 commit groups, 1.0 writes per commit group, ingest: 15.87 MB, 0.03 MB/s
                                           Interval WAL: 4162 writes, 352 syncs, 11.82 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.055       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.055       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.05              0.00         1    0.055       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:54.333405+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59293696 unmapped: 417792 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:55.333598+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59293696 unmapped: 417792 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:56.333788+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59293696 unmapped: 417792 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:57.334062+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59301888 unmapped: 409600 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:58.334561+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59301888 unmapped: 409600 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:59.334735+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59301888 unmapped: 409600 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:00.334928+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59310080 unmapped: 401408 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:01.335094+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59310080 unmapped: 401408 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:02.335230+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:03.335372+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14764 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:04.335553+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:05.335710+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:06.335807+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:07.335961+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59326464 unmapped: 385024 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:08.336162+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59326464 unmapped: 385024 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:09.336316+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59326464 unmapped: 385024 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:10.336416+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59334656 unmapped: 376832 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:11.336558+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59334656 unmapped: 376832 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:12.336690+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59334656 unmapped: 376832 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:13.336818+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59342848 unmapped: 368640 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:14.336962+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59342848 unmapped: 368640 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:15.337107+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59351040 unmapped: 360448 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:16.337596+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59351040 unmapped: 360448 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:17.337767+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59351040 unmapped: 360448 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:18.337919+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59359232 unmapped: 352256 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:19.338153+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59359232 unmapped: 352256 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:20.338358+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59367424 unmapped: 344064 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:21.338537+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59367424 unmapped: 344064 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:22.338731+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59367424 unmapped: 344064 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:23.338894+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59375616 unmapped: 335872 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:24.339043+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59375616 unmapped: 335872 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:25.339259+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59375616 unmapped: 335872 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:26.339412+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59383808 unmapped: 327680 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:27.339547+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59383808 unmapped: 327680 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:28.339736+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59383808 unmapped: 327680 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:29.339887+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59392000 unmapped: 319488 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:30.340100+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59392000 unmapped: 319488 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:31.340360+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59400192 unmapped: 311296 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:32.340533+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59400192 unmapped: 311296 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:33.340688+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59408384 unmapped: 303104 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:34.340845+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59408384 unmapped: 303104 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:35.340986+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59408384 unmapped: 303104 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:36.341195+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59416576 unmapped: 294912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:37.341381+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59416576 unmapped: 294912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:38.341660+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59424768 unmapped: 286720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:39.341826+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59424768 unmapped: 286720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:40.341993+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59424768 unmapped: 286720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:41.342469+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59432960 unmapped: 278528 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:42.342647+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59432960 unmapped: 278528 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:43.342825+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59441152 unmapped: 270336 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:44.342933+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59449344 unmapped: 262144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:45.343157+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59449344 unmapped: 262144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:46.343376+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59449344 unmapped: 262144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:47.343607+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59457536 unmapped: 253952 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:48.343808+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59457536 unmapped: 253952 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:49.343963+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59457536 unmapped: 253952 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:50.344205+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59465728 unmapped: 245760 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:51.344392+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59465728 unmapped: 245760 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:52.344752+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59473920 unmapped: 237568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:53.344892+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59473920 unmapped: 237568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:54.345048+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59473920 unmapped: 237568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:55.345398+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59482112 unmapped: 229376 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:56.345704+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59482112 unmapped: 229376 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:57.345863+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59482112 unmapped: 229376 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:58.346124+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59490304 unmapped: 221184 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:59.346363+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59490304 unmapped: 221184 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:00.346553+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59498496 unmapped: 212992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:01.346708+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59498496 unmapped: 212992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:02.346982+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59498496 unmapped: 212992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:03.347154+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59506688 unmapped: 204800 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:04.347344+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59506688 unmapped: 204800 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:05.347522+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59514880 unmapped: 196608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:06.347741+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59514880 unmapped: 196608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:07.347890+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59514880 unmapped: 196608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:08.348282+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59523072 unmapped: 188416 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:09.348477+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59523072 unmapped: 188416 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:10.348625+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59539456 unmapped: 172032 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:11.348773+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59539456 unmapped: 172032 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:12.348894+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59547648 unmapped: 163840 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:13.349048+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59547648 unmapped: 163840 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:14.349181+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59547648 unmapped: 163840 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:15.349380+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59555840 unmapped: 155648 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:16.349588+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59555840 unmapped: 155648 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:17.349710+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59555840 unmapped: 155648 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:18.349945+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59564032 unmapped: 147456 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:19.350096+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59564032 unmapped: 147456 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:20.350269+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59572224 unmapped: 139264 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:21.350437+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59572224 unmapped: 139264 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:22.350556+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59580416 unmapped: 131072 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:23.350689+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59580416 unmapped: 131072 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:24.350821+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59580416 unmapped: 131072 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:25.350965+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59588608 unmapped: 122880 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:26.351129+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59588608 unmapped: 122880 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:27.351225+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59588608 unmapped: 122880 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:28.351343+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59596800 unmapped: 114688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:29.351461+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59596800 unmapped: 114688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:30.351593+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59596800 unmapped: 114688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:31.351717+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59604992 unmapped: 106496 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:32.351848+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59604992 unmapped: 106496 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:33.352015+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59613184 unmapped: 98304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:34.352156+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59613184 unmapped: 98304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:35.352308+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59613184 unmapped: 98304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:36.352423+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59621376 unmapped: 90112 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:37.352541+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59621376 unmapped: 90112 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:38.352681+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59629568 unmapped: 81920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:39.352861+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59629568 unmapped: 81920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:40.353053+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59629568 unmapped: 81920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:41.353209+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59637760 unmapped: 73728 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:42.353358+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59637760 unmapped: 73728 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:43.353537+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59629568 unmapped: 81920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:44.353682+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59637760 unmapped: 73728 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:45.353870+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59637760 unmapped: 73728 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:46.354023+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59637760 unmapped: 73728 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:47.354268+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59645952 unmapped: 65536 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:48.354540+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59645952 unmapped: 65536 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:49.354733+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59645952 unmapped: 65536 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:50.354935+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:51.355053+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:52.355188+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:53.355394+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:54.355535+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:55.355668+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:56.355867+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:57.356017+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:58.356184+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:59.356322+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:00.356524+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:01.356661+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:02.356801+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:03.356953+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:04.357171+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:05.357382+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:06.357522+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:07.357682+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:08.357882+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:09.358032+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:10.358214+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:11.358356+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:12.358500+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:13.358696+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:14.358841+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:15.358978+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:16.359091+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:17.359229+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:18.359373+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:19.359489+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:20.359647+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:21.359794+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:22.359934+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:23.360066+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:24.360190+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:25.360339+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:26.360477+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:27.360621+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:28.360811+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:29.360975+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:30.361100+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:31.361233+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:32.361358+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:33.361511+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:34.361620+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:35.361739+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:36.361863+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:37.361991+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:38.362221+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:39.362394+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:40.362706+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:41.362848+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:42.362993+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:43.363165+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:44.363311+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:45.363434+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:46.363573+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:47.363703+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:48.364082+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:49.364285+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:50.364632+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:51.364811+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:52.364964+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:53.365091+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:54.365230+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:55.365354+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:56.365471+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:57.365603+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:58.366024+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:59.366224+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:00.366385+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:01.366509+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:02.366669+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:03.366845+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:04.367035+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:05.367210+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:06.367442+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:07.367617+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:08.367914+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:09.368119+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:10.368259+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:11.368423+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:12.368566+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:13.368747+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:14.368926+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:15.369068+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:16.369194+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:17.369366+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:18.369542+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:19.374683+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:20.374814+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:21.374934+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:22.375070+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:23.375232+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:24.375463+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:25.375608+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:26.375742+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:27.375917+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:28.376096+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:29.376275+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:30.376429+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:31.376590+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:32.376730+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:33.376871+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:34.376992+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:35.377121+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:36.377254+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:37.377423+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:38.377570+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:39.377732+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:40.378003+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:41.378170+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:42.378297+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:43.378420+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:44.378557+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:45.378708+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:46.379007+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:47.379328+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:48.379577+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:49.379980+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:50.380282+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:51.380724+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:52.380914+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:53.381066+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:54.381932+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:55.382049+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:56.382223+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:57.382361+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:58.382613+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:59.382742+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:00.382954+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:01.383096+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:02.383402+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:03.383548+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:04.383724+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:05.383871+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:06.384015+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:07.384166+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:08.384348+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:09.384593+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:10.384746+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:11.384985+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:12.385154+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:13.385336+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:14.385590+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:15.385802+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:16.385968+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:17.386148+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:18.386343+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:19.386525+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:20.386690+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:21.386858+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:22.386994+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:23.387135+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:24.387321+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:25.387476+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:26.387656+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:27.387807+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:28.388018+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:29.388261+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:30.388670+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:31.388878+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:32.389047+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:33.389209+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:34.389399+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:35.389536+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:36.389715+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:37.389849+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:38.390085+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:39.390215+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:40.390391+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:41.390563+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:42.390819+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:43.391007+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:44.391274+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:45.391466+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:46.391580+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:47.391726+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:48.391919+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:49.392045+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:50.392191+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:51.392333+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:52.392481+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:53.392668+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:54.392782+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:55.393090+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:56.393456+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:57.393685+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:58.394032+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:59.394340+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:00.394506+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:01.394728+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:02.394863+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:03.395090+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:04.395284+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: mgrc ms_handle_reset ms_handle_reset con 0x5595d67d3c00
Dec 01 09:39:27 compute-0 ceph-osd[90166]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/3312476512
Dec 01 09:39:27 compute-0 ceph-osd[90166]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/3312476512,v1:192.168.122.100:6801/3312476512]
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: get_auth_request con 0x5595d7ac8c00 auth_method 0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: mgrc handle_mgr_configure stats_period=5
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:05.395420+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:06.395630+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:07.395766+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:08.396007+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:09.396253+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:10.396481+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:11.396687+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:12.396902+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:13.397105+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:14.397308+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:15.397446+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:16.397624+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:17.397805+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:18.397986+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:19.398121+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:20.398259+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:21.398755+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:22.399363+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:23.401236+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:24.401918+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:25.402077+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:26.402304+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:27.402668+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:28.402798+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:29.402975+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:30.403226+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:31.403911+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:32.404059+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:33.404246+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:34.404434+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:35.404698+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:36.404814+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:37.404969+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:38.405230+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:39.405384+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:40.405524+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:41.405696+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:42.405875+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:43.406114+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:44.406259+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:45.406460+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:46.406667+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:47.406794+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:48.407047+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:49.407241+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:50.407357+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:51.407456+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:52.407618+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:53.407803+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:54.408081+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:55.408395+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:56.408825+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:57.409173+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:58.409369+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:59.409539+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:00.409889+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 ms_handle_reset con 0x5595d6fa8c00 session 0x5595d72c4960
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9203c00
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:01.410095+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:02.410272+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:03.410490+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:04.410652+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:05.410808+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:06.410938+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:07.411086+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:08.411305+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:09.411482+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:10.411615+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:11.411777+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:12.411925+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:13.412088+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:14.412347+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:15.412533+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:16.412746+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:17.413136+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:18.413390+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:19.413630+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:20.413871+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:21.414107+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:22.414235+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:23.414360+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:24.414683+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:25.414834+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:26.415065+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:27.415222+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:28.415390+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:29.415598+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:30.415744+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:31.415892+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:32.416031+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:33.416200+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:34.416402+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:35.416535+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:36.416824+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:37.417002+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:38.417197+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:39.417400+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:40.417568+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:41.417748+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:42.417892+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:43.418074+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:44.418216+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:45.418562+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:46.418671+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:47.418832+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:48.418966+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:49.419160+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:50.419318+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:51.419465+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:52.419831+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:53.419997+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:54.420157+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:55.420309+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:56.420440+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:57.420628+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:58.422779+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:59.423061+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:00.423220+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:01.423433+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:02.423795+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:03.423955+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:04.424091+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:05.424257+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:06.424408+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:07.424551+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:08.424747+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:09.425585+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:10.425734+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:11.425944+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:12.426081+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:13.426692+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:14.427137+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:15.427354+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:16.427548+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:17.427676+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:18.427895+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:19.428236+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:20.428441+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:21.428593+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:22.428869+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:23.429017+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:24.429370+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:25.429679+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:26.429945+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:27.430261+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:28.430513+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:29.430700+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:30.430985+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:31.431117+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:32.431270+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:33.431470+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:34.431648+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:35.431820+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:36.431975+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:37.432128+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:38.432303+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:39.432714+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:40.432851+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:41.432957+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:42.433147+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:43.433258+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:44.433380+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:45.433558+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:46.433734+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:47.433958+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:48.434579+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:49.434832+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:50.436192+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:51.436434+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:52.436633+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:53.437048+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:54.437354+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:55.437558+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:56.437703+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:57.438081+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:58.438360+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:59.438611+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:00.438783+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:01.438920+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:02.439993+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:03.440413+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:04.444590+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:05.446066+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:06.446599+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:07.447112+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:08.447798+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:09.448741+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:10.448870+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:11.449081+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:12.449399+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:13.449658+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:14.450383+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:15.450713+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:16.451009+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:17.451260+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:18.451484+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:19.451812+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:20.452233+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:21.452571+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:22.452887+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:23.453122+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:24.453376+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:25.453687+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:26.453842+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:27.454039+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:28.454524+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:29.454854+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:30.455175+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:31.455346+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:32.455555+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:33.455749+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:34.455897+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:35.456048+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:36.456188+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:37.456352+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:38.456507+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:39.456674+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:40.456999+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:41.457264+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:42.458476+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:43.458714+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:44.459003+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:45.459395+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:46.459668+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:47.459934+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:48.460237+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:49.460460+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:50.460781+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:51.461022+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:52.461257+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:53.461661+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:54.461967+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:55.462198+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:56.462429+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:57.462851+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:58.463140+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:59.463458+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:00.463696+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:01.464000+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:02.464361+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:03.464583+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:27 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:27 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:04.464874+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:27 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:05.465069+0000)
Dec 01 09:39:27 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:06.465254+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:07.465502+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:08.465895+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:09.466145+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:10.466425+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:11.466634+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:12.466800+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:13.466956+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:14.467120+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:15.467405+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:16.467656+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:17.467872+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:18.468119+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:19.468418+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:20.468651+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:21.468810+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:22.468964+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:23.469124+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:24.469326+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:25.469561+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:26.469814+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:27.470040+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:28.470360+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:29.470562+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:30.470814+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:31.471027+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:32.471341+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:33.471639+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:34.471931+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:35.472171+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:36.472504+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:37.472807+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:38.473124+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:39.473433+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:40.473709+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:41.474053+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:42.474428+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:43.474638+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:44.474988+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:45.475252+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:46.475516+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:47.475755+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:48.476083+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:49.476404+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:50.476650+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:51.476924+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:52.477204+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:53.477418+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Cumulative writes: 4162 writes, 19K keys, 4162 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 4162 writes, 352 syncs, 11.82 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.055       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.055       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.05              0.00         1    0.055       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:54.477643+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:55.477878+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:56.478152+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:57.478371+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:58.478671+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:59.478880+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:00.479099+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:01.479545+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:02.479871+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:03.480116+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:04.480455+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:05.480811+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:06.481018+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:07.481227+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:08.481711+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:09.481935+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:10.482377+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:11.483271+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:12.483668+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:13.483954+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:14.484208+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:15.484518+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:16.484806+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:17.485036+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:18.485437+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:19.485718+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:20.486030+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:21.486544+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:22.486816+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:23.487082+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:24.487283+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:25.487478+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:26.487662+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:27.487794+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:28.488509+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:29.488687+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:30.488988+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:31.489189+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:32.489370+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:33.489640+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:34.489877+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:35.490047+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:36.490181+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:37.490530+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:38.490772+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:39.490972+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:40.491164+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:41.491340+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:42.491502+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:43.491661+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:44.491808+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:45.491970+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:46.492201+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:47.492394+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:48.492595+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:49.492755+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:50.492926+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:51.493275+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:52.493538+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:53.493753+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:54.494119+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:55.494396+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:56.494657+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:57.494843+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:58.495115+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:59.495382+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:00.495694+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:01.495970+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:02.496173+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:03.496482+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:04.496809+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:05.497061+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:06.497318+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:07.497524+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:08.497804+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:09.498095+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:10.498341+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:11.498567+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:12.498800+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:13.498970+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:14.499193+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:15.499487+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:16.499742+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:17.500030+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:18.500423+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:19.500836+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:20.501107+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:21.501432+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:22.501727+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:23.502269+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:24.502618+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:25.502852+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:26.503141+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:27.503455+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:28.503954+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:29.504167+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:30.504538+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:31.504812+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:32.505000+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:33.505222+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:34.505395+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:35.505579+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:36.505770+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:37.505989+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:38.506145+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:39.506384+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:40.506679+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:41.506876+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:42.507068+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:43.507362+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:44.507572+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:45.507835+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:46.508207+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:47.508570+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:48.508856+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:49.509158+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d7ac9400
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 47 handle_osd_map epochs [48,48], i have 47, src has [1,48]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 1077.389038086s of 1077.396484375s, submitted: 2
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 48 heartbeat osd_stat(store_statfs(0x4fe14e000/0x0/0x4ffc00000, data 0x2f8bc/0x7f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:50.509377+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 48 handle_osd_map epochs [49,49], i have 48, src has [1,49]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 60096512 unmapped: 9977856 heap: 70074368 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:51.509753+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 49 handle_osd_map epochs [50,50], i have 49, src has [1,50]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 50 ms_handle_reset con 0x5595d7ac9400 session 0x5595d73d83c0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8136c00
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61227008 unmapped: 8847360 heap: 70074368 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:52.509989+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 50 heartbeat osd_stat(store_statfs(0x4fd065000/0x0/0x4ffc00000, data 0x11124f6/0x1168000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61538304 unmapped: 16932864 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:53.510477+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 50 handle_osd_map epochs [51,51], i have 50, src has [1,51]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 51 ms_handle_reset con 0x5595d8136c00 session 0x5595d6e51a40
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 540377 data_alloc: 218103808 data_used: 28672
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61579264 unmapped: 16891904 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:54.510823+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 51 heartbeat osd_stat(store_statfs(0x4fd061000/0x0/0x4ffc00000, data 0x1113acc/0x116b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61579264 unmapped: 16891904 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:55.511208+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 51 heartbeat osd_stat(store_statfs(0x4fd061000/0x0/0x4ffc00000, data 0x1113acc/0x116b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61579264 unmapped: 16891904 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:56.511466+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61579264 unmapped: 16891904 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:57.511808+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61579264 unmapped: 16891904 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:58.512097+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 51 heartbeat osd_stat(store_statfs(0x4fd061000/0x0/0x4ffc00000, data 0x1113acc/0x116b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 540377 data_alloc: 218103808 data_used: 28672
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61579264 unmapped: 16891904 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:59.512431+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61579264 unmapped: 16891904 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:00.512719+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61579264 unmapped: 16891904 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:01.513385+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 51 heartbeat osd_stat(store_statfs(0x4fd061000/0x0/0x4ffc00000, data 0x1113acc/0x116b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 51 handle_osd_map epochs [52,52], i have 51, src has [1,52]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 51 handle_osd_map epochs [52,52], i have 52, src has [1,52]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.469475746s of 11.739644051s, submitted: 54
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:02.513580+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:03.514081+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 542165 data_alloc: 218103808 data_used: 28672
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:04.514219+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:05.514421+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:06.514597+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 52 heartbeat osd_stat(store_statfs(0x4fd05f000/0x0/0x4ffc00000, data 0x1114f6c/0x116e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:07.514759+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:08.515017+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 542165 data_alloc: 218103808 data_used: 28672
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:09.515175+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 52 heartbeat osd_stat(store_statfs(0x4fd05f000/0x0/0x4ffc00000, data 0x1114f6c/0x116e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:10.515380+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:11.515573+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:12.515712+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:13.515896+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 542165 data_alloc: 218103808 data_used: 28672
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:14.516069+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 52 heartbeat osd_stat(store_statfs(0x4fd05f000/0x0/0x4ffc00000, data 0x1114f6c/0x116e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:15.516272+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:16.516544+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:17.516791+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:18.516994+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 542165 data_alloc: 218103808 data_used: 28672
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:19.517135+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:20.517401+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 52 heartbeat osd_stat(store_statfs(0x4fd05f000/0x0/0x4ffc00000, data 0x1114f6c/0x116e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:21.518043+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:22.519084+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:23.519675+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 542165 data_alloc: 218103808 data_used: 28672
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:24.520190+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:25.520856+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8137000
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 24.064493179s of 24.076759338s, submitted: 13
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 52 heartbeat osd_stat(store_statfs(0x4fd05f000/0x0/0x4ffc00000, data 0x1114f8f/0x116f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 52 handle_osd_map epochs [53,53], i have 52, src has [1,53]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 53 ms_handle_reset con 0x5595d8137000 session 0x5595d8062f00
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:26.521168+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61677568 unmapped: 16793600 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8137400
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 53 ms_handle_reset con 0x5595d8137400 session 0x5595d729f680
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8137800
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 53 ms_handle_reset con 0x5595d8137800 session 0x5595d81a34a0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:27.521480+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61661184 unmapped: 16809984 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:28.521995+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 53 heartbeat osd_stat(store_statfs(0x4fd05b000/0x0/0x4ffc00000, data 0x1116549/0x1172000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61661184 unmapped: 16809984 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d7ac9400
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 53 heartbeat osd_stat(store_statfs(0x4fd05b000/0x0/0x4ffc00000, data 0x1116549/0x1172000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 53 handle_osd_map epochs [53,54], i have 53, src has [1,54]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 53 handle_osd_map epochs [54,54], i have 54, src has [1,54]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 555402 data_alloc: 218103808 data_used: 45056
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 54 ms_handle_reset con 0x5595d7ac9400 session 0x5595d81a3e00
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:29.522364+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 62808064 unmapped: 15663104 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8137000
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 54 ms_handle_reset con 0x5595d8137000 session 0x5595d80623c0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8137400
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 54 ms_handle_reset con 0x5595d8137400 session 0x5595d6e50f00
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:30.522716+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8137c00
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 62824448 unmapped: 15646720 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 54 heartbeat osd_stat(store_statfs(0x4fd057000/0x0/0x4ffc00000, data 0x1117b13/0x1176000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 54 handle_osd_map epochs [54,55], i have 54, src has [1,55]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:31.522872+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 55 ms_handle_reset con 0x5595d8137c00 session 0x5595d6e512c0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 62988288 unmapped: 15482880 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9e800
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:32.523137+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 62988288 unmapped: 15482880 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 55 handle_osd_map epochs [56,56], i have 55, src has [1,56]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:33.523263+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 56 ms_handle_reset con 0x5595d9b9e800 session 0x5595d80921e0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 64208896 unmapped: 14262272 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 56 heartbeat osd_stat(store_statfs(0x4fd055000/0x0/0x4ffc00000, data 0x1119111/0x1178000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 559201 data_alloc: 218103808 data_used: 45056
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:34.523646+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 64241664 unmapped: 14229504 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9e800
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 56 handle_osd_map epochs [56,57], i have 56, src has [1,57]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 57 handle_osd_map epochs [57,57], i have 57, src has [1,57]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:35.523858+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 57 ms_handle_reset con 0x5595d9b9e800 session 0x5595d8075680
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 64421888 unmapped: 14049280 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:36.524199+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 64421888 unmapped: 14049280 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d7ac9400
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.944048882s of 11.445683479s, submitted: 143
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9ec00
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8136c00
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 57 handle_osd_map epochs [57,58], i have 57, src has [1,58]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 57 handle_osd_map epochs [58,58], i have 58, src has [1,58]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:37.524362+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 58 ms_handle_reset con 0x5595d9b9ec00 session 0x5595d729eb40
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 58 ms_handle_reset con 0x5595d7ac9400 session 0x5595d80a52c0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 65781760 unmapped: 21086208 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8137c00
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 58 handle_osd_map epochs [58,59], i have 58, src has [1,59]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 58 handle_osd_map epochs [59,59], i have 59, src has [1,59]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:38.524523+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 59 ms_handle_reset con 0x5595d8137c00 session 0x5595d7b001e0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 65945600 unmapped: 20922368 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8137400
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 59 heartbeat osd_stat(store_statfs(0x4fc84b000/0x0/0x4ffc00000, data 0x191d2db/0x1982000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 854268 data_alloc: 218103808 data_used: 61440
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:39.524726+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 59 handle_osd_map epochs [60,60], i have 59, src has [1,60]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 60 ms_handle_reset con 0x5595d8137400 session 0x5595d7bae780
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 60 ms_handle_reset con 0x5595d8136c00 session 0x5595d80785a0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 66093056 unmapped: 20774912 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8137400
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d7ac9400
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 60 handle_osd_map epochs [60,61], i have 60, src has [1,61]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 61 ms_handle_reset con 0x5595d8137400 session 0x5595d73d83c0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:40.524886+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8137c00
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 61 heartbeat osd_stat(store_statfs(0x4fa844000/0x0/0x4ffc00000, data 0x391fecd/0x398a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 66355200 unmapped: 20512768 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 61 handle_osd_map epochs [62,62], i have 61, src has [1,62]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 62 ms_handle_reset con 0x5595d8137c00 session 0x5595d8197860
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:41.525251+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 62 ms_handle_reset con 0x5595d7ac9400 session 0x5595d729e960
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9e800
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 66609152 unmapped: 20258816 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9ec00
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8137000
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 62 handle_osd_map epochs [63,63], i have 62, src has [1,63]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 63 ms_handle_reset con 0x5595d8137000 session 0x5595d729f680
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d7ac9400
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8136c00
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 63 ms_handle_reset con 0x5595d9b9e800 session 0x5595d81963c0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 63 heartbeat osd_stat(store_statfs(0x4fc83a000/0x0/0x4ffc00000, data 0x11237d9/0x1192000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 63 ms_handle_reset con 0x5595d9b9ec00 session 0x5595d80a4b40
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:42.525352+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68116480 unmapped: 18751488 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 63 handle_osd_map epochs [63,64], i have 63, src has [1,64]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 64 heartbeat osd_stat(store_statfs(0x4fc426000/0x0/0x4ffc00000, data 0x1124dd4/0x1194000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e3f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:43.525472+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68116480 unmapped: 18751488 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 64 handle_osd_map epochs [64,65], i have 64, src has [1,65]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 65 ms_handle_reset con 0x5595d8136c00 session 0x5595d7b16d20
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 65 ms_handle_reset con 0x5595d7ac9400 session 0x5595d8079a40
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8137400
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 65 ms_handle_reset con 0x5595d8137400 session 0x5595d8196b40
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 613840 data_alloc: 218103808 data_used: 122880
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:44.525631+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 65 heartbeat osd_stat(store_statfs(0x4fc422000/0x0/0x4ffc00000, data 0x11263ba/0x1197000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e3f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 18694144 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8137400
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 65 handle_osd_map epochs [66,66], i have 65, src has [1,66]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:45.525796+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 66 ms_handle_reset con 0x5595d8137400 session 0x5595d729f680
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68124672 unmapped: 18743296 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d7ac9400
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 66 handle_osd_map epochs [66,67], i have 66, src has [1,67]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 66 handle_osd_map epochs [67,67], i have 67, src has [1,67]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:46.525961+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 67 ms_handle_reset con 0x5595d7ac9400 session 0x5595d8092b40
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68132864 unmapped: 18735104 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8136c00
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 67 handle_osd_map epochs [67,68], i have 67, src has [1,68]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.158089638s of 10.213048935s, submitted: 251
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9e800
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:47.526078+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68386816 unmapped: 18481152 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 68 ms_handle_reset con 0x5595d8136c00 session 0x5595d7b01680
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9ec00
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 68 handle_osd_map epochs [69,69], i have 68, src has [1,69]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 69 ms_handle_reset con 0x5595d9b9e800 session 0x5595d82c2f00
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:48.526267+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 69 ms_handle_reset con 0x5595d9b9ec00 session 0x5595d729e3c0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68444160 unmapped: 18423808 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d7ac9400
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8136c00
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 69 heartbeat osd_stat(store_statfs(0x4fcc19000/0x0/0x4ffc00000, data 0x112ce2f/0x11a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e3f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 69 handle_osd_map epochs [70,70], i have 69, src has [1,70]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 630316 data_alloc: 218103808 data_used: 139264
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:49.526408+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 70 ms_handle_reset con 0x5595d7ac9400 session 0x5595d729eb40
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 70 ms_handle_reset con 0x5595d8136c00 session 0x5595d729fc20
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68624384 unmapped: 18243584 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8137400
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 70 handle_osd_map epochs [71,71], i have 70, src has [1,71]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:50.526534+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 71 ms_handle_reset con 0x5595d8137400 session 0x5595d8196f00
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68698112 unmapped: 18169856 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:51.526689+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68698112 unmapped: 18169856 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:52.526846+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9e800
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 71 ms_handle_reset con 0x5595d9b9e800 session 0x5595d73e14a0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68698112 unmapped: 18169856 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8137c00
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 71 ms_handle_reset con 0x5595d8137c00 session 0x5595d73e1680
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 71 heartbeat osd_stat(store_statfs(0x4fcc12000/0x0/0x4ffc00000, data 0x112f364/0x11a9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e3f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:53.527043+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68698112 unmapped: 18169856 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d7ac9400
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 71 ms_handle_reset con 0x5595d7ac9400 session 0x5595d73d83c0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8136c00
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 71 handle_osd_map epochs [72,72], i have 71, src has [1,72]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:54.527279+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 634276 data_alloc: 218103808 data_used: 139264
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68771840 unmapped: 18096128 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 72 handle_osd_map epochs [72,73], i have 72, src has [1,73]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 72 handle_osd_map epochs [73,73], i have 73, src has [1,73]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 73 ms_handle_reset con 0x5595d8136c00 session 0x5595d73d8f00
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:55.527505+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 73 heartbeat osd_stat(store_statfs(0x4fcc10000/0x0/0x4ffc00000, data 0x1130982/0x11ad000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e3f9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68771840 unmapped: 18096128 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8137400
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9e800
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 73 heartbeat osd_stat(store_statfs(0x4fba6d000/0x0/0x4ffc00000, data 0x113204f/0x11b0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:56.527653+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68788224 unmapped: 18079744 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:57.527796+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68788224 unmapped: 18079744 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:58.527977+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68788224 unmapped: 18079744 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9c800
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.005791664s of 11.874329567s, submitted: 231
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 73 handle_osd_map epochs [74,74], i have 73, src has [1,74]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:59.528100+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 74 ms_handle_reset con 0x5595d9b9c800 session 0x5595d739d860
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 640656 data_alloc: 218103808 data_used: 151552
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68911104 unmapped: 17956864 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9c400
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 74 handle_osd_map epochs [74,75], i have 74, src has [1,75]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 75 ms_handle_reset con 0x5595d9b9c400 session 0x5595d81974a0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:00.528373+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68919296 unmapped: 17948672 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9c000
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 75 heartbeat osd_stat(store_statfs(0x4fba64000/0x0/0x4ffc00000, data 0x1134c7d/0x11b8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:01.528514+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 75 ms_handle_reset con 0x5595d9b9c000 session 0x5595d80a4d20
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68952064 unmapped: 17915904 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 75 heartbeat osd_stat(store_statfs(0x4fba64000/0x0/0x4ffc00000, data 0x1134c7d/0x11b8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d7ac9400
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 75 ms_handle_reset con 0x5595d7ac9400 session 0x5595d7d3a780
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8136c00
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:02.528707+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 75 handle_osd_map epochs [75,76], i have 75, src has [1,76]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 76 ms_handle_reset con 0x5595d8136c00 session 0x5595d72b8b40
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 69074944 unmapped: 17793024 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:03.528915+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9c000
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 76 handle_osd_map epochs [77,77], i have 76, src has [1,77]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 69099520 unmapped: 17768448 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 77 ms_handle_reset con 0x5595d9b9c000 session 0x5595d729e3c0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:04.529101+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9c400
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 654076 data_alloc: 218103808 data_used: 155648
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 77 handle_osd_map epochs [78,78], i have 77, src has [1,78]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 69255168 unmapped: 17612800 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 78 ms_handle_reset con 0x5595d9b9c400 session 0x5595d7b16b40
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:05.529362+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 78 handle_osd_map epochs [79,79], i have 78, src has [1,79]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70246400 unmapped: 16621568 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:06.529563+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 79 heartbeat osd_stat(store_statfs(0x4fba59000/0x0/0x4ffc00000, data 0x113a3c8/0x11c3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70246400 unmapped: 16621568 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:07.529916+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70246400 unmapped: 16621568 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9c800
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d9b9c800 session 0x5595d72c52c0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d7ac9400
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8136c00
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:08.530170+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d8136c00 session 0x5595d73d92c0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d7ac9400 session 0x5595d8197680
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9c000
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d9b9c000 session 0x5595d8074b40
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9c400
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d9b9c400 session 0x5595d729e960
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d72ad000
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d72ad000 session 0x5595d80a5a40
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d7ac9400
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70254592 unmapped: 16613376 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d7ac9400 session 0x5595d80a4960
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8136c00
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d8136c00 session 0x5595d7b01e00
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:09.530374+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 665586 data_alloc: 218103808 data_used: 172032
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70254592 unmapped: 16613376 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9c000
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d9b9c000 session 0x5595d81a3a40
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 79 handle_osd_map epochs [79,80], i have 79, src has [1,80]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.607955933s of 11.038110733s, submitted: 98
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9c400
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 80 ms_handle_reset con 0x5595d9b9c400 session 0x5595d7b16b40
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d7348800
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:10.531083+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 80 ms_handle_reset con 0x5595d7348800 session 0x5595d7b165a0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 80 heartbeat osd_stat(store_statfs(0x4fba59000/0x0/0x4ffc00000, data 0x113a3c8/0x11c3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70205440 unmapped: 16662528 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d7ac9400
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8136c00
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:11.531245+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70205440 unmapped: 16662528 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 80 heartbeat osd_stat(store_statfs(0x4fba57000/0x0/0x4ffc00000, data 0x113b8a0/0x11c6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:12.531434+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70205440 unmapped: 16662528 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:13.531602+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70221824 unmapped: 16646144 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:14.531757+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 667454 data_alloc: 218103808 data_used: 172032
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70221824 unmapped: 16646144 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9c000
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 80 ms_handle_reset con 0x5595d9b9c000 session 0x5595d7bae3c0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9c400
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:15.531910+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70230016 unmapped: 16637952 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 80 handle_osd_map epochs [81,81], i have 80, src has [1,81]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 81 ms_handle_reset con 0x5595d9b9c400 session 0x5595d7b00780
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9598000
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9598400
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 81 ms_handle_reset con 0x5595d9598400 session 0x5595d8062f00
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 81 ms_handle_reset con 0x5595d9598000 session 0x5595d7b00f00
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:16.532072+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9598800
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 81 ms_handle_reset con 0x5595d9598800 session 0x5595d80743c0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9598000
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70279168 unmapped: 16588800 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 81 ms_handle_reset con 0x5595d9598000 session 0x5595d72b90e0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:17.532218+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 81 handle_osd_map epochs [82,82], i have 81, src has [1,82]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 82 heartbeat osd_stat(store_statfs(0x4fba54000/0x0/0x4ffc00000, data 0x113ce5a/0x11c9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70328320 unmapped: 16539648 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9598400
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 82 handle_osd_map epochs [83,83], i have 82, src has [1,83]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 83 ms_handle_reset con 0x5595d9598400 session 0x5595d73e0d20
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:18.532384+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70352896 unmapped: 16515072 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 83 heartbeat osd_stat(store_statfs(0x4fba50000/0x0/0x4ffc00000, data 0x113e468/0x11cc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9c000
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 83 ms_handle_reset con 0x5595d9b9c000 session 0x5595d739d0e0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9c400
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 83 ms_handle_reset con 0x5595d9b9c400 session 0x5595d6de4d20
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9598c00
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 83 ms_handle_reset con 0x5595d9598c00 session 0x5595d7b16b40
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:19.532537+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 679530 data_alloc: 218103808 data_used: 172032
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70336512 unmapped: 16531456 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9598000
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.183552742s of 10.257410049s, submitted: 36
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:20.532701+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70369280 unmapped: 16498688 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 83 ms_handle_reset con 0x5595d9598000 session 0x5595d7b165a0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9598400
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:21.532815+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 83 heartbeat osd_stat(store_statfs(0x4fba4e000/0x0/0x4ffc00000, data 0x113fa80/0x11d0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70377472 unmapped: 16490496 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 83 handle_osd_map epochs [84,84], i have 83, src has [1,84]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:22.532949+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 84 ms_handle_reset con 0x5595d9598400 session 0x5595d72c52c0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70385664 unmapped: 16482304 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:23.533124+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 84 ms_handle_reset con 0x5595d7ac9400 session 0x5595d7b170e0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 84 ms_handle_reset con 0x5595d8136c00 session 0x5595d73d9860
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70385664 unmapped: 16482304 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9c000
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:24.533342+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 680355 data_alloc: 218103808 data_used: 192512
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70385664 unmapped: 16482304 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 84 handle_osd_map epochs [84,85], i have 84, src has [1,85]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 84 heartbeat osd_stat(store_statfs(0x4fba4c000/0x0/0x4ffc00000, data 0x1141058/0x11d1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:25.533523+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70402048 unmapped: 16465920 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 85 handle_osd_map epochs [86,86], i have 85, src has [1,86]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:26.533639+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 86 ms_handle_reset con 0x5595d9b9c000 session 0x5595d8074b40
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70402048 unmapped: 16465920 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 86 heartbeat osd_stat(store_statfs(0x4fba45000/0x0/0x4ffc00000, data 0x1143b22/0x11d7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:27.533768+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70402048 unmapped: 16465920 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:28.533925+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70434816 unmapped: 16433152 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 86 ms_handle_reset con 0x5595d8137400 session 0x5595d73d8780
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 86 ms_handle_reset con 0x5595d9b9e800 session 0x5595d8196960
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:29.534654+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d7ac9400
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 687015 data_alloc: 218103808 data_used: 192512
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 86 ms_handle_reset con 0x5595d7ac9400 session 0x5595d73d9a40
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70434816 unmapped: 16433152 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 86 handle_osd_map epochs [86,87], i have 86, src has [1,87]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:30.534825+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70369280 unmapped: 16498688 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:31.534987+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8136c00
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.584367752s of 11.068427086s, submitted: 87
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 87 ms_handle_reset con 0x5595d8136c00 session 0x5595d80634a0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9598000
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 87 heartbeat osd_stat(store_statfs(0x4fba43000/0x0/0x4ffc00000, data 0x1144fde/0x11da000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70418432 unmapped: 16449536 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 87 handle_osd_map epochs [87,88], i have 87, src has [1,88]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 88 ms_handle_reset con 0x5595d9598000 session 0x5595d7bae1e0
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:32.535137+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70426624 unmapped: 16441344 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:33.535277+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70426624 unmapped: 16441344 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:34.535488+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 88 heartbeat osd_stat(store_statfs(0x4fba41000/0x0/0x4ffc00000, data 0x11465ba/0x11dc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 690967 data_alloc: 218103808 data_used: 196608
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70426624 unmapped: 16441344 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:35.535630+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70426624 unmapped: 16441344 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:36.535770+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 88 heartbeat osd_stat(store_statfs(0x4fba41000/0x0/0x4ffc00000, data 0x11465ba/0x11dc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 88 handle_osd_map epochs [89,89], i have 88, src has [1,89]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70500352 unmapped: 16367616 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:37.537730+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70500352 unmapped: 16367616 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:38.538731+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70500352 unmapped: 16367616 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:39.538886+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 693939 data_alloc: 218103808 data_used: 196608
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70500352 unmapped: 16367616 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:40.539066+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70500352 unmapped: 16367616 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:41.539214+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70500352 unmapped: 16367616 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:42.539347+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 89 handle_osd_map epochs [89,90], i have 89, src has [1,90]
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.918384552s of 11.027014732s, submitted: 76
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3e000/0x0/0x4ffc00000, data 0x1147a76/0x11df000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70508544 unmapped: 16359424 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:43.539501+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:44.539696+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:45.539866+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:46.540011+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:47.540173+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:48.540363+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:49.540526+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:50.540696+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:51.540830+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:52.540963+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:53.541160+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:54.541360+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:55.541502+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:56.541679+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:57.541836+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:58.541989+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:59.542319+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:00.542477+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:01.542645+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:02.542784+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:03.542920+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:04.543106+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:05.543219+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:06.543413+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:07.543604+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:08.543947+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:09.544098+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:10.544250+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:11.544445+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:12.544582+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:13.544815+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:14.545104+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:15.545265+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:16.545459+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:17.545617+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:18.545943+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:19.546103+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:20.546304+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:21.546477+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:22.546653+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:23.546796+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:24.546975+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:25.547105+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:26.547328+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:27.547600+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:28.547872+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:29.548026+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:30.548431+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:31.548787+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:32.549062+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:33.549277+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:34.549485+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:35.549635+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:36.549924+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:37.550209+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:38.550520+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:39.550831+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:40.551029+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:41.551772+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:42.551946+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:43.552129+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:44.552781+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:45.553125+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:46.553508+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:47.553720+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:48.553972+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:49.554182+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:50.554369+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:51.554527+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:52.554655+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:53.554773+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:54.554906+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:28 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:28 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: do_command 'config diff' '{prefix=config diff}'
Dec 01 09:39:28 compute-0 ceph-osd[90166]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec 01 09:39:28 compute-0 ceph-osd[90166]: do_command 'config show' '{prefix=config show}'
Dec 01 09:39:28 compute-0 ceph-osd[90166]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70811648 unmapped: 16056320 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: do_command 'counter dump' '{prefix=counter dump}'
Dec 01 09:39:28 compute-0 ceph-osd[90166]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:55.555040+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: do_command 'counter schema' '{prefix=counter schema}'
Dec 01 09:39:28 compute-0 ceph-osd[90166]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec 01 09:39:28 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 15605760 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:56.555203+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 15720448 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:39:28 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:57.555361+0000)
Dec 01 09:39:28 compute-0 ceph-osd[90166]: do_command 'log dump' '{prefix=log dump}'
Dec 01 09:39:28 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Dec 01 09:39:28 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3916327048' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 01 09:39:28 compute-0 ceph-mon[75031]: from='client.14754 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:28 compute-0 ceph-mon[75031]: from='client.14756 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:28 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1132193302' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 01 09:39:28 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2888214778' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 01 09:39:28 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3916327048' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 01 09:39:28 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14768 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:28 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Dec 01 09:39:28 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2405876524' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 01 09:39:28 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14772 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:28 compute-0 sudo[260421]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:39:28 compute-0 sudo[260421]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:39:28 compute-0 sudo[260421]: pam_unix(sudo:session): session closed for user root
Dec 01 09:39:28 compute-0 sudo[260454]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:39:28 compute-0 sudo[260454]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:39:28 compute-0 sudo[260454]: pam_unix(sudo:session): session closed for user root
Dec 01 09:39:28 compute-0 sudo[260481]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:39:28 compute-0 sudo[260481]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:39:28 compute-0 sudo[260481]: pam_unix(sudo:session): session closed for user root
Dec 01 09:39:28 compute-0 sudo[260535]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Dec 01 09:39:28 compute-0 sudo[260535]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:39:29 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14776 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:29 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Dec 01 09:39:29 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2668326975' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 01 09:39:29 compute-0 ceph-mon[75031]: from='client.14760 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:29 compute-0 ceph-mon[75031]: pgmap v868: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:29 compute-0 ceph-mon[75031]: from='client.14764 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:29 compute-0 ceph-mon[75031]: from='client.14768 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:29 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2405876524' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 01 09:39:29 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2668326975' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 01 09:39:29 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #42. Immutable memtables: 0.
Dec 01 09:39:29 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:39:29.228024) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 09:39:29 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 42
Dec 01 09:39:29 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581969228060, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 617, "num_deletes": 251, "total_data_size": 445464, "memory_usage": 456960, "flush_reason": "Manual Compaction"}
Dec 01 09:39:29 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #43: started
Dec 01 09:39:29 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581969235153, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 43, "file_size": 439195, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17564, "largest_seqno": 18180, "table_properties": {"data_size": 435845, "index_size": 1194, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 7966, "raw_average_key_size": 19, "raw_value_size": 429149, "raw_average_value_size": 1046, "num_data_blocks": 55, "num_entries": 410, "num_filter_entries": 410, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764581926, "oldest_key_time": 1764581926, "file_creation_time": 1764581969, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 43, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:39:29 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 7476 microseconds, and 3258 cpu microseconds.
Dec 01 09:39:29 compute-0 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 09:39:29 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:39:29.235490) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #43: 439195 bytes OK
Dec 01 09:39:29 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:39:29.235627) [db/memtable_list.cc:519] [default] Level-0 commit table #43 started
Dec 01 09:39:29 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:39:29.237381) [db/memtable_list.cc:722] [default] Level-0 commit table #43: memtable #1 done
Dec 01 09:39:29 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:39:29.237460) EVENT_LOG_v1 {"time_micros": 1764581969237393, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 01 09:39:29 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:39:29.237487) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 01 09:39:29 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 442046, prev total WAL file size 442046, number of live WAL files 2.
Dec 01 09:39:29 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000039.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:39:29 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:39:29.238072) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031323535' seq:72057594037927935, type:22 .. '7061786F730031353037' seq:0, type:0; will stop at (end)
Dec 01 09:39:29 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 01 09:39:29 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [43(428KB)], [41(5337KB)]
Dec 01 09:39:29 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581969238132, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [43], "files_L6": [41], "score": -1, "input_data_size": 5905070, "oldest_snapshot_seqno": -1}
Dec 01 09:39:29 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #44: 3772 keys, 4740877 bytes, temperature: kUnknown
Dec 01 09:39:29 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581969274725, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 44, "file_size": 4740877, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 4715809, "index_size": 14527, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9477, "raw_key_size": 89955, "raw_average_key_size": 23, "raw_value_size": 4648025, "raw_average_value_size": 1232, "num_data_blocks": 622, "num_entries": 3772, "num_filter_entries": 3772, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580340, "oldest_key_time": 0, "file_creation_time": 1764581969, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:39:29 compute-0 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 09:39:29 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:39:29.274951) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 4740877 bytes
Dec 01 09:39:29 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:39:29.276318) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 161.1 rd, 129.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 5.2 +0.0 blob) out(4.5 +0.0 blob), read-write-amplify(24.2) write-amplify(10.8) OK, records in: 4281, records dropped: 509 output_compression: NoCompression
Dec 01 09:39:29 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:39:29.276334) EVENT_LOG_v1 {"time_micros": 1764581969276326, "job": 20, "event": "compaction_finished", "compaction_time_micros": 36664, "compaction_time_cpu_micros": 12635, "output_level": 6, "num_output_files": 1, "total_output_size": 4740877, "num_input_records": 4281, "num_output_records": 3772, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 09:39:29 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000043.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:39:29 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581969276503, "job": 20, "event": "table_file_deletion", "file_number": 43}
Dec 01 09:39:29 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:39:29 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764581969277491, "job": 20, "event": "table_file_deletion", "file_number": 41}
Dec 01 09:39:29 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:39:29.237953) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:39:29 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:39:29.277559) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:39:29 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:39:29.277566) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:39:29 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:39:29.277568) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:39:29 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:39:29.277570) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:39:29 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:39:29.277573) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:39:29 compute-0 sudo[260535]: pam_unix(sudo:session): session closed for user root
Dec 01 09:39:29 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:39:29 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:39:29 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec 01 09:39:29 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:39:29 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec 01 09:39:29 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:39:29 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 5f0fb47a-a6a2-4329-86d7-5fba4d7a0a45 does not exist
Dec 01 09:39:29 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 97aff81c-14fd-4ab5-bff5-f0f260a84c58 does not exist
Dec 01 09:39:29 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 8bf75225-2d95-490c-b709-2f43b018763e does not exist
Dec 01 09:39:29 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec 01 09:39:29 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:39:29 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec 01 09:39:29 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:39:29 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:39:29 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:39:29 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14778 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:29 compute-0 sudo[260656]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:39:29 compute-0 sudo[260656]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:39:29 compute-0 sudo[260656]: pam_unix(sudo:session): session closed for user root
Dec 01 09:39:29 compute-0 sudo[260687]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:39:29 compute-0 sudo[260687]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:39:29 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon stat"} v 0) v1
Dec 01 09:39:29 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1834446799' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Dec 01 09:39:29 compute-0 sudo[260687]: pam_unix(sudo:session): session closed for user root
Dec 01 09:39:29 compute-0 sudo[260714]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:39:29 compute-0 sudo[260714]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:39:29 compute-0 sudo[260714]: pam_unix(sudo:session): session closed for user root
Dec 01 09:39:29 compute-0 sudo[260759]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --yes --no-systemd
Dec 01 09:39:29 compute-0 sudo[260759]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:39:29 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v869: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:30 compute-0 podman[260846]: 2025-12-01 09:39:30.033679988 +0000 UTC m=+0.049410923 container create f9140dce348748266fc7f05550308fad1d7c20db33be1191671dec20d367d62e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wonderful_sutherland, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Dec 01 09:39:30 compute-0 systemd[1]: Started libpod-conmon-f9140dce348748266fc7f05550308fad1d7c20db33be1191671dec20d367d62e.scope.
Dec 01 09:39:30 compute-0 podman[260846]: 2025-12-01 09:39:30.010839121 +0000 UTC m=+0.026570086 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:39:30 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:39:30 compute-0 podman[260846]: 2025-12-01 09:39:30.135965121 +0000 UTC m=+0.151696086 container init f9140dce348748266fc7f05550308fad1d7c20db33be1191671dec20d367d62e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wonderful_sutherland, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:39:30 compute-0 podman[260846]: 2025-12-01 09:39:30.145199547 +0000 UTC m=+0.160930482 container start f9140dce348748266fc7f05550308fad1d7c20db33be1191671dec20d367d62e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wonderful_sutherland, CEPH_REF=reef, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:39:30 compute-0 podman[260846]: 2025-12-01 09:39:30.150777988 +0000 UTC m=+0.166508923 container attach f9140dce348748266fc7f05550308fad1d7c20db33be1191671dec20d367d62e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wonderful_sutherland, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, ceph=True)
Dec 01 09:39:30 compute-0 wonderful_sutherland[260877]: 167 167
Dec 01 09:39:30 compute-0 systemd[1]: libpod-f9140dce348748266fc7f05550308fad1d7c20db33be1191671dec20d367d62e.scope: Deactivated successfully.
Dec 01 09:39:30 compute-0 podman[260846]: 2025-12-01 09:39:30.152000513 +0000 UTC m=+0.167731448 container died f9140dce348748266fc7f05550308fad1d7c20db33be1191671dec20d367d62e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wonderful_sutherland, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:39:30 compute-0 systemd[1]: var-lib-containers-storage-overlay-d9069dcc5b9a23f5b88a6637c41b1d627b12407dd0614ed311ca22345ad7c86f-merged.mount: Deactivated successfully.
Dec 01 09:39:30 compute-0 podman[260846]: 2025-12-01 09:39:30.198730377 +0000 UTC m=+0.214461332 container remove f9140dce348748266fc7f05550308fad1d7c20db33be1191671dec20d367d62e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wonderful_sutherland, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Dec 01 09:39:30 compute-0 systemd[1]: libpod-conmon-f9140dce348748266fc7f05550308fad1d7c20db33be1191671dec20d367d62e.scope: Deactivated successfully.
Dec 01 09:39:30 compute-0 ceph-mon[75031]: from='client.14772 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:30 compute-0 ceph-mon[75031]: from='client.14776 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:30 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:39:30 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:39:30 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:39:30 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:39:30 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:39:30 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:39:30 compute-0 ceph-mon[75031]: from='client.14778 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:30 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1834446799' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Dec 01 09:39:30 compute-0 podman[260959]: 2025-12-01 09:39:30.382776163 +0000 UTC m=+0.055996072 container create 6d837dc40f1d9aa9a94acf89cc5a87e21629db7d31d71b5a53a98a67868297bd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_rhodes, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:39:30 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14786 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:30 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:39:30.396+0000 7fd2d6503640 -1 mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Dec 01 09:39:30 compute-0 ceph-mgr[75324]: mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Dec 01 09:39:30 compute-0 systemd[1]: Started libpod-conmon-6d837dc40f1d9aa9a94acf89cc5a87e21629db7d31d71b5a53a98a67868297bd.scope.
Dec 01 09:39:30 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "node ls"} v 0) v1
Dec 01 09:39:30 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1795086165' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Dec 01 09:39:30 compute-0 podman[260959]: 2025-12-01 09:39:30.362011066 +0000 UTC m=+0.035231005 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:39:30 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:39:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/359e8c412c40f675ba46edf6017c90c3b5f808ca55f5817c181335d5126d0cab/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:39:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/359e8c412c40f675ba46edf6017c90c3b5f808ca55f5817c181335d5126d0cab/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:39:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/359e8c412c40f675ba46edf6017c90c3b5f808ca55f5817c181335d5126d0cab/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:39:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/359e8c412c40f675ba46edf6017c90c3b5f808ca55f5817c181335d5126d0cab/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:39:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/359e8c412c40f675ba46edf6017c90c3b5f808ca55f5817c181335d5126d0cab/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:39:30 compute-0 podman[260959]: 2025-12-01 09:39:30.496006382 +0000 UTC m=+0.169226311 container init 6d837dc40f1d9aa9a94acf89cc5a87e21629db7d31d71b5a53a98a67868297bd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_rhodes, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:39:30 compute-0 podman[260959]: 2025-12-01 09:39:30.505579217 +0000 UTC m=+0.178799156 container start 6d837dc40f1d9aa9a94acf89cc5a87e21629db7d31d71b5a53a98a67868297bd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_rhodes, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:39:30 compute-0 podman[260959]: 2025-12-01 09:39:30.509823159 +0000 UTC m=+0.183043098 container attach 6d837dc40f1d9aa9a94acf89cc5a87e21629db7d31d71b5a53a98a67868297bd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_rhodes, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS)
Dec 01 09:39:30 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:39:30 compute-0 crontab[261032]: (root) LIST (root)
Dec 01 09:39:30 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0) v1
Dec 01 09:39:30 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/506962092' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Dec 01 09:39:30 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush class ls"} v 0) v1
Dec 01 09:39:30 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/276962131' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Dec 01 09:39:31 compute-0 nova_compute[250706]: 2025-12-01 09:39:31.053 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:39:31 compute-0 nova_compute[250706]: 2025-12-01 09:39:31.058 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:39:31 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush dump"} v 0) v1
Dec 01 09:39:31 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/80551184' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Dec 01 09:39:31 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0) v1
Dec 01 09:39:31 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3354255277' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Dec 01 09:39:31 compute-0 ceph-mon[75031]: pgmap v869: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:31 compute-0 ceph-mon[75031]: from='client.14786 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:31 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1795086165' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Dec 01 09:39:31 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/506962092' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Dec 01 09:39:31 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/276962131' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Dec 01 09:39:31 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/80551184' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Dec 01 09:39:31 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3354255277' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Dec 01 09:39:31 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush rule ls"} v 0) v1
Dec 01 09:39:31 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3228762746' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Dec 01 09:39:31 compute-0 eloquent_rhodes[260993]: --> passed data devices: 0 physical, 3 LVM
Dec 01 09:39:31 compute-0 eloquent_rhodes[260993]: --> relative data size: 1.0
Dec 01 09:39:31 compute-0 eloquent_rhodes[260993]: --> All data devices are unavailable
Dec 01 09:39:31 compute-0 systemd[1]: libpod-6d837dc40f1d9aa9a94acf89cc5a87e21629db7d31d71b5a53a98a67868297bd.scope: Deactivated successfully.
Dec 01 09:39:31 compute-0 systemd[1]: libpod-6d837dc40f1d9aa9a94acf89cc5a87e21629db7d31d71b5a53a98a67868297bd.scope: Consumed 1.093s CPU time.
Dec 01 09:39:31 compute-0 podman[260959]: 2025-12-01 09:39:31.694280532 +0000 UTC m=+1.367500451 container died 6d837dc40f1d9aa9a94acf89cc5a87e21629db7d31d71b5a53a98a67868297bd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_rhodes, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 01 09:39:31 compute-0 systemd[1]: var-lib-containers-storage-overlay-359e8c412c40f675ba46edf6017c90c3b5f808ca55f5817c181335d5126d0cab-merged.mount: Deactivated successfully.
Dec 01 09:39:31 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0) v1
Dec 01 09:39:31 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2676379662' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Dec 01 09:39:31 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v870: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:31 compute-0 podman[260959]: 2025-12-01 09:39:31.751835898 +0000 UTC m=+1.425055817 container remove 6d837dc40f1d9aa9a94acf89cc5a87e21629db7d31d71b5a53a98a67868297bd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_rhodes, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 01 09:39:31 compute-0 systemd[1]: libpod-conmon-6d837dc40f1d9aa9a94acf89cc5a87e21629db7d31d71b5a53a98a67868297bd.scope: Deactivated successfully.
Dec 01 09:39:31 compute-0 sudo[260759]: pam_unix(sudo:session): session closed for user root
Dec 01 09:39:31 compute-0 sudo[261244]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:39:31 compute-0 sudo[261244]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:39:31 compute-0 sudo[261244]: pam_unix(sudo:session): session closed for user root
Dec 01 09:39:31 compute-0 sudo[261292]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:39:31 compute-0 sudo[261292]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:39:31 compute-0 sudo[261292]: pam_unix(sudo:session): session closed for user root
Dec 01 09:39:31 compute-0 sudo[261334]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:39:31 compute-0 sudo[261334]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:39:31 compute-0 sudo[261334]: pam_unix(sudo:session): session closed for user root
Dec 01 09:39:32 compute-0 sudo[261361]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- lvm list --format json
Dec 01 09:39:32 compute-0 sudo[261361]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:39:32 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0) v1
Dec 01 09:39:32 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1721211012' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Dec 01 09:39:32 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0) v1
Dec 01 09:39:32 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/895605000' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Dec 01 09:39:32 compute-0 podman[261473]: 2025-12-01 09:39:32.424382991 +0000 UTC m=+0.050577137 container create cf4e6739ea1eba3ffcf1a8d882ccf3908825521076ba361ddd0c89cabd8e8329 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_grothendieck, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=reef)
Dec 01 09:39:32 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0) v1
Dec 01 09:39:32 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/356448226' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Dec 01 09:39:32 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3228762746' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Dec 01 09:39:32 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2676379662' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Dec 01 09:39:32 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1721211012' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Dec 01 09:39:32 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/895605000' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Dec 01 09:39:32 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/356448226' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Dec 01 09:39:32 compute-0 systemd[1]: Started libpod-conmon-cf4e6739ea1eba3ffcf1a8d882ccf3908825521076ba361ddd0c89cabd8e8329.scope.
Dec 01 09:39:32 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:39:32 compute-0 podman[261473]: 2025-12-01 09:39:32.397475736 +0000 UTC m=+0.023669902 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:39:32 compute-0 podman[261473]: 2025-12-01 09:39:32.508495351 +0000 UTC m=+0.134689517 container init cf4e6739ea1eba3ffcf1a8d882ccf3908825521076ba361ddd0c89cabd8e8329 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_grothendieck, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec 01 09:39:32 compute-0 podman[261473]: 2025-12-01 09:39:32.516084489 +0000 UTC m=+0.142278635 container start cf4e6739ea1eba3ffcf1a8d882ccf3908825521076ba361ddd0c89cabd8e8329 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_grothendieck, CEPH_REF=reef, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Dec 01 09:39:32 compute-0 podman[261473]: 2025-12-01 09:39:32.519386365 +0000 UTC m=+0.145580531 container attach cf4e6739ea1eba3ffcf1a8d882ccf3908825521076ba361ddd0c89cabd8e8329 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_grothendieck, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:39:32 compute-0 systemd[1]: libpod-cf4e6739ea1eba3ffcf1a8d882ccf3908825521076ba361ddd0c89cabd8e8329.scope: Deactivated successfully.
Dec 01 09:39:32 compute-0 quizzical_grothendieck[261491]: 167 167
Dec 01 09:39:32 compute-0 conmon[261491]: conmon cf4e6739ea1eba3ffcf1 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-cf4e6739ea1eba3ffcf1a8d882ccf3908825521076ba361ddd0c89cabd8e8329.scope/container/memory.events
Dec 01 09:39:32 compute-0 podman[261473]: 2025-12-01 09:39:32.525361456 +0000 UTC m=+0.151555602 container died cf4e6739ea1eba3ffcf1a8d882ccf3908825521076ba361ddd0c89cabd8e8329 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_grothendieck, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:39:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-ed3e3a11e3c3af09fd8c8fc1a658f1b132ac95d5d23fbde8cd297b01876b0c3c-merged.mount: Deactivated successfully.
Dec 01 09:39:32 compute-0 podman[261473]: 2025-12-01 09:39:32.568697123 +0000 UTC m=+0.194891269 container remove cf4e6739ea1eba3ffcf1a8d882ccf3908825521076ba361ddd0c89cabd8e8329 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_grothendieck, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default)
Dec 01 09:39:32 compute-0 systemd[1]: libpod-conmon-cf4e6739ea1eba3ffcf1a8d882ccf3908825521076ba361ddd0c89cabd8e8329.scope: Deactivated successfully.
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.a( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.a( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.a( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000027 1 0.000023
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.a( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d(unlocked)] enter Initial
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000010 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000009
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000036 1 0.000030
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9(unlocked)] enter Initial
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000030 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000013 1 0.000016
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000031 1 0.000024
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b(unlocked)] enter Initial
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000012 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000004 1 0.000007
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000027 1 0.000022
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8(unlocked)] enter Initial
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000016 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000011
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000049 1 0.000029
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9(unlocked)] enter Initial
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000011 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000013 1 0.000016
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000040 1 0.000029
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16(unlocked)] enter Initial
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000011 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000004 1 0.000007
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000027 1 0.000023
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17(unlocked)] enter Initial
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000012 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000005 1 0.000009
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000054 1 0.000511
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14(unlocked)] enter Initial
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000047 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000007 1 0.000018
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000079 1 0.000030
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15(unlocked)] enter Initial
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000032 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000007 1 0.000020
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000009 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000109 1 0.000037
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12(unlocked)] enter Initial
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000105 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000020 1 0.000039
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000007 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000151 1 0.000055
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12(unlocked)] enter Initial
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000031 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000005 1 0.000014
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000010 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000053 1 0.000036
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13(unlocked)] enter Initial
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000024 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000005 1 0.000014
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000040 1 0.000052
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17(unlocked)] enter Initial
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000022 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000005 1 0.000008
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000053 1 0.000027
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10(unlocked)] enter Initial
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.001060 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000008 1 0.000024
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000007 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000076 1 0.000057
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10( empty local-lis/les=0/0 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11(unlocked)] enter Initial
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000042 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000007 1 0.000017
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000100 1 0.000035
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d(unlocked)] enter Initial
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000032 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000007 1 0.000016
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000069 1 0.000034
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c(unlocked)] enter Initial
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000034 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000016 1 0.000024
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000046 1 0.000030
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c( empty local-lis/les=0/0 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b(unlocked)] enter Initial
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000080 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000009 1 0.000022
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000124 1 0.000086
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.333077 1 0.000029
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.336865 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.893488 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.893512 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1c] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666628838s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.502311707s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1c] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666577339s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502311707s@ mbc={}] exit Reset 0.000108 1 0.000146
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666577339s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502311707s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666577339s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502311707s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666577339s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502311707s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666577339s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502311707s@ mbc={}] exit Start 0.000007 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666577339s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502311707s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.011075 2 0.000030
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.954357 15 0.000117
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.964609 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.964710 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.964750 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045631409s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881553650s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045609474s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881553650s@ mbc={}] exit Reset 0.000053 1 0.000056
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045609474s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881553650s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045609474s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881553650s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045609474s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881553650s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045609474s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881553650s@ mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:32 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1694800268' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045609474s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881553650s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.953814 15 0.001072
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.964654 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.964818 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.964864 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045528412s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881576538s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045514107s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881576538s@ mbc={}] exit Reset 0.000026 1 0.000045
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045514107s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881576538s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045514107s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881576538s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045514107s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881576538s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045514107s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881576538s@ mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045514107s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881576538s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.333638 1 0.000031
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.337307 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.895408 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.895424 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.13] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666149139s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.502319336s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.13] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666124344s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502319336s@ mbc={}] exit Reset 0.000054 1 0.000076
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666124344s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502319336s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666124344s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502319336s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666124344s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502319336s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666124344s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502319336s@ mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.666124344s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502319336s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.954771 15 0.000146
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.965101 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.965172 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.965197 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045070648s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881378174s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045049667s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881378174s@ mbc={}] exit Reset 0.000039 1 0.000059
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045049667s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881378174s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045049667s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881378174s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045049667s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881378174s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045049667s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881378174s@ mbc={}] exit Start 0.000007 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045049667s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881378174s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.954879 15 0.000087
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.965189 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.965337 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.965369 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045023918s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881462097s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045001030s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881462097s@ mbc={}] exit Reset 0.000039 1 0.000060
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045001030s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881462097s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045001030s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881462097s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045001030s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881462097s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045001030s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881462097s@ mbc={}] exit Start 0.000008 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.045001030s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881462097s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.334004 1 0.000027
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.337607 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.900750 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.900795 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.11] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665835381s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.502403259s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.11] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665820122s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502403259s@ mbc={}] exit Reset 0.000029 1 0.000046
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665820122s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502403259s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665820122s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502403259s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665820122s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502403259s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665820122s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502403259s@ mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665820122s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502403259s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.955301 15 0.000096
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.965713 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.965818 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.965864 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044493675s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881225586s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044479370s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881225586s@ mbc={}] exit Reset 0.000026 1 0.000052
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044479370s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881225586s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044479370s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881225586s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044479370s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881225586s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044479370s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881225586s@ mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044479370s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881225586s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.955515 15 0.000087
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.965890 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.966026 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.966048 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044302940s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881187439s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044287682s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881187439s@ mbc={}] exit Reset 0.000027 1 0.000051
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044287682s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881187439s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044287682s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881187439s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044287682s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881187439s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044287682s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881187439s@ mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.044287682s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881187439s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.334066 1 0.000031
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.337831 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.895928 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.895942 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.15] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665797234s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.502769470s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.15] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665776253s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502769470s@ mbc={}] exit Reset 0.000031 1 0.000043
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665776253s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502769470s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665776253s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502769470s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665776253s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502769470s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665776253s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502769470s@ mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.665776253s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.502769470s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.955592 15 0.000119
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.966157 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.966265 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.966291 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043943405s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881034851s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043929100s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881034851s@ mbc={}] exit Reset 0.000039 1 0.000039
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043929100s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881034851s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043929100s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881034851s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043929100s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881034851s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043929100s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881034851s@ mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043929100s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881034851s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.956014 15 0.000236
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.966411 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.966570 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.966614 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043830872s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881057739s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043811798s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881057739s@ mbc={}] exit Reset 0.000048 1 0.000071
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043811798s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881057739s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043811798s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881057739s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043811798s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881057739s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043811798s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881057739s@ mbc={}] exit Start 0.000007 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043811798s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881057739s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.327971 1 0.000040
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.336448 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.898669 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.898693 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.a] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.671725273s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.509086609s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.a] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.671697617s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.509086609s@ mbc={}] exit Reset 0.000046 1 0.000081
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.671697617s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.509086609s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.671697617s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.509086609s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.671697617s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.509086609s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.671697617s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.509086609s@ mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.671697617s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.509086609s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.328889 1 0.000177
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.337779 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.899010 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.899026 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.9] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670864105s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508338928s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.9] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670849800s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508338928s@ mbc={}] exit Reset 0.000024 1 0.000039
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670849800s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508338928s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670849800s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508338928s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670849800s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508338928s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670849800s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508338928s@ mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670849800s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508338928s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.956459 15 0.000061
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.966799 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.966979 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.967012 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043417931s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880981445s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043404579s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880981445s@ mbc={}] exit Reset 0.000022 1 0.000045
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043404579s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880981445s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043404579s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880981445s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043404579s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880981445s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043404579s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880981445s@ mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.043404579s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880981445s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.329534 1 0.000047
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.338024 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.898587 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.898614 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.8] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670290947s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508003235s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.8] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670258522s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508003235s@ mbc={}] exit Reset 0.000047 1 0.000069
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670258522s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508003235s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670258522s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508003235s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670258522s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508003235s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670258522s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508003235s@ mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670258522s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508003235s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.329537 1 0.000042
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.338056 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.897494 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.897510 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.f] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670177460s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508010864s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.f] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670162201s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508010864s@ mbc={}] exit Reset 0.000025 1 0.000041
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670162201s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508010864s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670162201s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508010864s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670162201s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508010864s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670162201s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508010864s@ mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670162201s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508010864s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.329808 1 0.000029
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.338125 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.899730 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.899751 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.6] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670031548s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508018494s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.6] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670013428s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508018494s@ mbc={}] exit Reset 0.000034 1 0.000050
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670013428s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508018494s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670013428s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508018494s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670013428s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508018494s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670013428s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508018494s@ mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.670013428s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508018494s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.957691 15 0.000066
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.968302 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.968368 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.968397 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.329858 1 0.000034
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.338282 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.900450 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.957690 15 0.000073
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.900470 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.968317 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.968435 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.968476 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042176247s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880577087s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.5] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669865608s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508308411s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042231560s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880676270s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042157173s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880676270s@ mbc={}] exit Reset 0.000096 1 0.000117
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042157173s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880676270s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042157173s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880676270s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042157173s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880676270s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.5] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042157173s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880676270s@ mbc={}] exit Start 0.000011 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042157173s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880676270s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042071342s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880577087s@ mbc={}] exit Reset 0.000208 1 0.000232
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669753075s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508308411s@ mbc={}] exit Reset 0.000141 1 0.000180
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042071342s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880577087s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669753075s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508308411s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042071342s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880577087s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042071342s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880577087s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042071342s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880577087s@ mbc={}] exit Start 0.000017 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.042071342s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880577087s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669753075s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508308411s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669753075s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508308411s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669753075s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508308411s@ mbc={}] exit Start 0.000033 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669753075s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508308411s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.957996 15 0.000137
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.968799 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.968855 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.968878 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041958809s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880676270s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.014513 2 0.000057
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041932106s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880676270s@ mbc={}] exit Reset 0.000047 1 0.000068
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041932106s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880676270s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041932106s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880676270s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041932106s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880676270s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041932106s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880676270s@ mbc={}] exit Start 0.000017 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041932106s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880676270s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.330342 1 0.000043
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.338888 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.014302 2 0.000050
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.901479 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.015063 2 0.000778
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.901542 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000008 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000008 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.4] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669230461s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508064270s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.4] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000009 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.013925 2 0.000026
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.013346 2 0.000029
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.012307 2 0.000060
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.330203 1 0.000028
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669204712s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508064270s@ mbc={}] exit Reset 0.000047 1 0.000078
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.338716 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.901442 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669204712s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508064270s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.958032 15 0.000286
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.901463 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.968537 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.969404 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.330235 1 0.000029
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669622421s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508628845s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.338727 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.901333 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.969436 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.901355 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669596672s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508628845s@ mbc={}] exit Reset 0.000054 1 0.000090
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669596672s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508628845s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669596672s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508628845s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669596672s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508628845s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669596672s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508628845s@ mbc={}] exit Start 0.000007 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041983604s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.881034851s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669596672s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508628845s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.2] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669599533s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508674622s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041935921s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881034851s@ mbc={}] exit Reset 0.000070 1 0.000125
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041935921s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881034851s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041935921s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881034851s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041935921s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881034851s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.958809 15 0.000086
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041935921s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881034851s@ mbc={}] exit Start 0.000008 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.2] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041935921s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.881034851s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669544220s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508674622s@ mbc={}] exit Reset 0.000102 1 0.000132
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.969376 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669544220s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508674622s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669544220s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508674622s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.969565 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669544220s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508674622s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669544220s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508674622s@ mbc={}] exit Start 0.000012 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669544220s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508674622s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.969600 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041315079s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880500793s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.329673 1 0.000034
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.338879 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.901864 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041291237s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880500793s@ mbc={}] exit Reset 0.000040 1 0.000096
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.901927 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041291237s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880500793s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669204712s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508064270s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041291237s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880500793s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669204712s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508064270s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041291237s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880500793s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669204712s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508064270s@ mbc={}] exit Start 0.000279 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.3] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.958986 15 0.000109
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669204712s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508064270s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041291237s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880500793s@ mbc={}] exit Start 0.000010 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041291237s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880500793s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.969766 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.969863 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669652939s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508903503s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.969889 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.3] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669572830s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508903503s@ mbc={}] exit Reset 0.000105 1 0.000138
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041206360s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880538940s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669572830s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508903503s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669572830s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508903503s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669572830s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508903503s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.330237 1 0.000025
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041181564s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880538940s@ mbc={}] exit Reset 0.000043 1 0.000105
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041181564s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880538940s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.339037 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041181564s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880538940s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.899276 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041181564s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880538940s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.958880 15 0.000081
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.899296 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.969565 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041181564s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880538940s@ mbc={}] exit Start 0.000014 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669572830s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508903503s@ mbc={}] exit Start 0.000011 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.970869 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041181564s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880538940s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.c] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669507027s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508911133s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669572830s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508903503s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.c] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669469833s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508911133s@ mbc={}] exit Reset 0.000055 1 0.000085
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.970999 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669469833s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508911133s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669469833s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508911133s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669469833s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508911133s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669469833s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508911133s@ mbc={}] exit Start 0.000008 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669469833s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508911133s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.959140 15 0.000112
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.970904 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041090965s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880569458s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.971056 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.971078 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.330365 1 0.000029
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041068077s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880569458s@ mbc={}] exit Reset 0.000048 1 0.000123
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.339081 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041068077s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880569458s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040925980s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880439758s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041068077s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880569458s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041068077s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880569458s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041068077s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880569458s@ mbc={}] exit Start 0.000017 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.899977 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.041068077s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880569458s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.900019 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040884972s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880439758s@ mbc={}] exit Reset 0.000057 1 0.000079
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040884972s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880439758s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.e] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.330234 1 0.000027
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669351578s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508926392s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040884972s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880439758s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040884972s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880439758s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.e] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040884972s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880439758s@ mbc={}] exit Start 0.000028 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040884972s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880439758s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669313431s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508926392s@ mbc={}] exit Reset 0.000061 1 0.000111
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669313431s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508926392s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669313431s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508926392s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.959342 15 0.000107
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.970249 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.971157 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.971188 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.965321 15 0.000228
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.971493 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.971661 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040772438s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880477905s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.971721 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.034724236s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.874450684s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.034696579s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.874450684s@ mbc={}] exit Reset 0.000045 1 0.000075
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.034696579s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.874450684s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.034696579s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.874450684s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.034696579s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.874450684s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.034696579s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.874450684s@ mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.034696579s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.874450684s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.339090 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.897997 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.898079 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.330639 1 0.000025
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.339277 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.902705 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.330664 1 0.000028
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.902723 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.339328 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.903190 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040751457s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880477905s@ mbc={}] exit Reset 0.000193 1 0.000133
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.903212 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1f] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1a] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040751457s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880477905s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669052124s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508941650s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669108391s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508995056s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040751457s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880477905s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040751457s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880477905s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1f] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.18] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040751457s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880477905s@ mbc={}] exit Start 0.000013 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669086456s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.508987427s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1a] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040751457s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880477905s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.18] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669052124s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508987427s@ mbc={}] exit Reset 0.000055 1 0.000084
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669052124s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508987427s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669073105s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508995056s@ mbc={}] exit Reset 0.000077 1 0.000074
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669052124s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508987427s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669003487s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508941650s@ mbc={}] exit Reset 0.000098 1 0.000378
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669052124s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508987427s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669073105s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508995056s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669003487s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508941650s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669052124s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508987427s@ mbc={}] exit Start 0.000012 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669073105s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508995056s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669003487s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508941650s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669052124s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508987427s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669073105s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508995056s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669003487s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508941650s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669073105s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508995056s@ mbc={}] exit Start 0.000008 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669073105s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508995056s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669003487s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508941650s@ mbc={}] exit Start 0.000016 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.959197 15 0.000122
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669003487s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508941650s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.969834 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.971494 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.971529 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040740967s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880767822s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 5.330792 1 0.000020
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 5.339419 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 5.898241 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 5.898258 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=43) [1] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.959204 15 0.000141
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1b] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.970161 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.668955803s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 82.509048462s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.971619 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040719986s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880767822s@ mbc={}] exit Reset 0.000039 1 0.000060
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1b] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.971664 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040719986s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880767822s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040719986s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880767822s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.668935776s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.509048462s@ mbc={}] exit Reset 0.000043 1 0.000072
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040719986s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880767822s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.668935776s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.509048462s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040719986s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880767822s@ mbc={}] exit Start 0.000010 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.668935776s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.509048462s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040763855s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active pruub 84.880889893s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040719986s) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880767822s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.015009 2 0.000057
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.668935776s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.509048462s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.668935776s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.509048462s@ mbc={}] exit Start 0.000044 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.668935776s) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.509048462s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040732384s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880889893s@ mbc={}] exit Reset 0.000049 1 0.000074
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040732384s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880889893s@ mbc={}] enter Started
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040732384s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880889893s@ mbc={}] enter Start
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040732384s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880889893s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040732384s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880889893s@ mbc={}] exit Start 0.000012 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.014392 2 0.000028
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46 pruub=13.040732384s) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 84.880889893s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.012731 2 0.000073
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.014022 2 0.000033
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.012398 2 0.000052
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.014881 2 0.000053
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000007 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.014371 2 0.000025
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.011153 2 0.000024
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.012262 2 0.000021
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.010870 2 0.000018
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.012012 2 0.000029
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.009982 2 0.000018
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.009896 2 0.000018
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000017 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.008784 2 0.000038
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.008223 2 0.000057
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.007847 2 0.000025
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.010783 2 0.000020
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.007707 2 0.000025
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.010714 2 0.000019
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000098 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000010 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.014230 2 0.000022
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.014021 2 0.000022
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.014260 2 0.000029
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.013855 2 0.000019
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.013658 2 0.000020
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000283 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.013183 2 0.000017
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.013962 2 0.000019
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.013746 2 0.000020
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.013384 2 0.000019
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.013155 2 0.000020
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.012304 2 0.000030
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.17( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.012126 2 0.000032
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.011098 2 0.000035
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000018 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.12( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.009258 2 0.000040
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.016495 2 0.000091
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000009 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[4.14( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.13] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.13] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.013586 2 0.000061
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.012867 2 0.000043
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1c] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1c] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.11] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.15] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.15] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.11] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.a] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.8] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.a] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.5] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.8] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.5] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.2] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.2] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.c] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.c] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1a] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1a] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.017104 2 0.000029
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.9] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.9] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.f] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.6] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.f] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.6] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.4] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.3] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.017538 2 0.000026
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669313431s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508926392s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.3] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669313431s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508926392s@ mbc={}] exit Start 0.013001 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46 pruub=10.669313431s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 82.508926392s@ mbc={}] enter Started/Stray
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.4] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.e] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.e] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.18] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1f] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.18] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1b] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1f] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.024493 2 0.000019
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 46 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1b] failed. State was: unregistering
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.b scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.b scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:14:57.796804+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 13 sent 11 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:15:27.585192+0000 osd.1 (osd.1) 12 : cluster [DBG] 3.b scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:15:27.599372+0000 osd.1 (osd.1) 13 : cluster [DBG] 3.b scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 57532416 unmapped: 1130496 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 46 handle_osd_map epochs [46,47], i have 46, src has [1,47]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 46 handle_osd_map epochs [46,47], i have 47, src has [1,47]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.070264 2 0.000059
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.087918 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.074607 2 0.000059
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1c( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.074779 2 0.000064
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.088533 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.11( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.079415 2 0.000033
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.088787 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.11( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.10( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.088019 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.082553 2 0.000192
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.1b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.090501 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.17( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.17( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.082692 2 0.000130
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.090722 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.13( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.12( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.079742 2 0.000041
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.12( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.090942 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.1b( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.12( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.13( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.12( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.082972 2 0.000043
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.091391 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.083023 2 0.000032
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.091966 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.15( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.12( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.15( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.12( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.071673 2 0.000045
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.088892 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1d( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.17( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.080154 2 0.000088
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.17( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.092600 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.17( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.17( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.14( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.080032 2 0.000226
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.14( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.092473 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.083345 2 0.000047
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.093309 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.16( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.16( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.083405 2 0.000056
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.093477 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.9( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.9( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.080412 2 0.000117
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.093653 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.8( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.080525 2 0.000327
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.14( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.14( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.080687 2 0.000021
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.093921 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.b( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.093981 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.9( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.083495 2 0.000119
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.094320 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.d( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.083625 2 0.000053
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.094474 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.a( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.a( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.080803 2 0.000378
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.094653 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.e( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.083918 2 0.000038
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.094851 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.3( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.3( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.080940 2 0.000393
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.094970 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.5( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.081468 2 0.000023
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.095188 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.7( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.084151 2 0.000036
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.095386 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.5( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.5( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.081640 2 0.000024
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.095587 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.071260 2 0.000044
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.081870 2 0.000036
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.096232 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.095865 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.084357 2 0.000034
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.4( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.096436 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.4( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.4( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.082350 2 0.000045
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.096669 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.2( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.084573 2 0.000036
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.096937 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.7( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.7( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.084719 2 0.000054
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.097331 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.084784 2 0.000071
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.097686 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.6( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.6( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.081413 2 0.000029
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.098126 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.2( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.086222 2 0.000668
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.098636 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.081406 2 0.000529
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.096467 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.9( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.4( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.9( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.6( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.084893 2 0.000030
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.099135 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.084913 2 0.000036
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.d( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.099411 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.f( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 handle_osd_map epochs [47,47], i have 47, src has [1,47]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.086486 2 0.000032
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.099929 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.f( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.f( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.085118 2 0.000108
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.085236 2 0.000033
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.100194 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.099733 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.c( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.c( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.c( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.085368 2 0.000042
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.100497 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.d( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.086695 2 0.000052
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.089745 2 0.000037
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.100720 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.100957 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1d( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1e( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1d( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.086880 2 0.000051
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.101305 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1a( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1a( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.086923 2 0.000174
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.101788 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.18( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.18( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.087048 2 0.000100
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.102453 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.19( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.19( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 13) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:15:27.585192+0000 osd.1 (osd.1) 12 : cluster [DBG] 3.b scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:15:27.599372+0000 osd.1 (osd.1) 13 : cluster [DBG] 3.b scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 handle_osd_map epochs [47,47], i have 47, src has [1,47]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1c( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.10( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1c( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008505 4 0.000175
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1c( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1c( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1c( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.10( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008404 4 0.000117
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.10( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.10( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000009 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.10( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.11( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.17( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.12( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.13( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.1b( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.15( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.12( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.11( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.009155 4 0.000128
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.11( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.11( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.17( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008966 4 0.000063
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.11( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.17( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.17( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000028 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.17( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.12( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008922 4 0.000055
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.12( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.12( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.12( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.13( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008989 4 0.000083
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.1b( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008995 4 0.000525
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.13( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.1b( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.13( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.1b( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.13( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.1b( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.15( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008882 4 0.000066
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.15( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.12( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008878 4 0.000106
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.12( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.15( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.12( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.15( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.12( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1d( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.17( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1d( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017619 4 0.000097
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1d( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.17( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017680 4 0.000120
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.17( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.17( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.17( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.16( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.9( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.8( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.14( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.b( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.9( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1d( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.d( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1d( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.a( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.e( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.3( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.16( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017891 4 0.000069
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.16( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.9( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017855 4 0.000070
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.9( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.16( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000009 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.9( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.5( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.16( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.9( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.8( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017893 4 0.000046
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.8( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.8( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000010 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.8( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.7( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.5( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.4( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.2( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.4( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.7( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.14( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018147 4 0.000277
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.14( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.14( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.14( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.6( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.2( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.4( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.9( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.6( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.d( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.b( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018366 4 0.000054
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.f( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.c( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.c( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.f( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.9( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018436 4 0.000196
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.d( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018391 4 0.000092
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.d( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.d( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.d( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.b( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.a( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018429 4 0.000051
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.a( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.b( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000084 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.a( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.b( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.a( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.e( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018413 4 0.000047
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.e( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.3( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018363 4 0.000036
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.3( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.e( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.e( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.3( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.3( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.5( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018378 4 0.000056
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.5( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.7( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018303 4 0.000058
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.5( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000009 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.7( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.5( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.7( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.7( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.5( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018263 4 0.000067
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.5( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.5( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.5( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018230 4 0.000092
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.4( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018093 4 0.000056
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.2( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018037 4 0.000049
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.4( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.2( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.2( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.4( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000012 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.2( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.4( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.4( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018149 4 0.000148
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.4( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.4( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.4( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.7( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017985 4 0.000185
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.7( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.7( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000010 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017939 4 0.000058
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.7( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.6( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017909 4 0.000063
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.6( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.6( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000009 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.6( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.2( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017906 4 0.000056
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.2( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.2( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.4( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017840 4 0.000501
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.4( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.9( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017847 4 0.000066
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.9( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.4( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000014 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.4( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.9( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000009 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.9( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[2.9( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [1] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.9( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000009 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.6( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017869 4 0.000511
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.9( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.6( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.6( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.6( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.d( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017800 4 0.000164
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.d( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.d( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.d( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.f( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017795 4 0.000080
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.f( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.f( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.f( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.c( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017707 4 0.000059
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.c( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.c( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.c( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.f( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017761 4 0.000069
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.f( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.f( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.f( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.d( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.c( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017697 4 0.000059
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.c( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1e( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.c( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000011 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.c( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1a( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.18( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.19( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1e( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017723 4 0.000083
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1e( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1e( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1a( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017671 4 0.000054
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.1e( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1a( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1a( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1a( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.18( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017598 4 0.000052
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.18( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.19( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017516 4 0.000068
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.18( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.19( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.19( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000015 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.18( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.19( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.d( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017918 4 0.000281
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.d( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.d( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1d( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[4.d( empty local-lis/les=46/47 n=0 ec=41/25 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[6.2( empty local-lis/les=46/47 n=0 ec=43/29 lis/c=46/43 les/c/f=47/44/0 sis=46) [1] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1d( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017917 4 0.000341
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1d( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1d( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[5.1d( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [1] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.113420 7 0.000043
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.113356 7 0.000060
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.113800 7 0.000062
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000088 1 0.000052
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000178 1 0.000051
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.15( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.15] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.114318 7 0.000080
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000468 1 0.000021
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.11( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.11] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000327 1 0.000071
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.118674 7 0.000045
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.114637 7 0.000103
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.114641 7 0.000079
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000125 1 0.000047
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000166 1 0.000016
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1b] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.005130 1 0.005049
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.124248 7 0.000078
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.124554 7 0.000111
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000077 1 0.000059
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000107 1 0.000040
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1c] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.124956 7 0.000087
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.125468 7 0.000066
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.124210 7 0.000120
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.123791 7 0.000065
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.123704 7 0.000083
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.123920 7 0.000272
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.124155 7 0.000083
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.123598 7 0.000077
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.123556 7 0.000074
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.123251 7 0.000069
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.110379 7 0.013105
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.125429 7 0.000073
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.123108 7 0.000081
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000361 1 0.000202
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.122937 7 0.000130
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.8( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.8] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000376 1 0.000125
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.5( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.5] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000662 1 0.000231
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000768 1 0.000014
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.2( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.2] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001154 1 0.000028
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001254 1 0.000094
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001368 1 0.000059
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001476 1 0.000026
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.e( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.e] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001548 1 0.000031
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.002443 1 0.000119
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.002537 1 0.000292
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.a] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.002627 1 0.000081
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1a( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1a] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.002753 1 0.000146
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.c( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.c] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.002723 1 0.000120
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.130748 7 0.000059
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.130511 7 0.000069
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.130263 7 0.000061
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.127536 7 0.000069
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.127486 7 0.000084
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.127987 7 0.000075
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.128565 7 0.000067
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.129102 7 0.000055
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.127676 7 0.000146
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.129646 7 0.000068
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.128672 7 0.000135
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.128033 7 0.000436
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.127411 7 0.000252
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.130213 7 0.000086
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.127365 7 0.000105
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.129504 7 0.000066
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.127418 7 0.000072
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000422 1 0.000037
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.13] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000522 1 0.000059
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.129684 7 0.000063
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000788 1 0.000038
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000828 1 0.000016
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000938 1 0.000013
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001012 1 0.000014
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000961 1 0.000109
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001007 1 0.000020
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001042 1 0.000023
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001094 1 0.000186
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.6] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001141 1 0.000024
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.4] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001165 1 0.000019
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001203 1 0.000018
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001309 1 0.000025
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1f] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001376 1 0.000024
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.f] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001293 1 0.000478
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.3] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001525 1 0.000240
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.18] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001384 1 0.000482
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.9] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.11( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.021316 1 0.000060
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.11( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.021454 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.11( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.134917 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.15( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.021510 1 0.000134
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.15( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.021721 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.15( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.135117 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.11( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.027524 1 0.000078
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.11( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.028039 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.11( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.141868 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.15] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.11] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.16( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.034883 1 0.000090
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.16( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.035246 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.16( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.149597 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.17( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.038378 1 0.000046
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.17( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.038561 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.17( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.157272 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1b( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.045626 1 0.000022
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1b( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.045866 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1b( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.160570 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1b] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1f( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.047894 1 0.000119
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1f( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.053103 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1f( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.167786 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.18( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.055158 1 0.000058
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.18( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.055282 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.18( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.179571 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1c( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.062318 1 0.001171
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1c( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.062469 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1c( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.187077 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1c] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.8( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.067269 1 0.000211
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.8( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.067682 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.8( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.192698 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.8] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.e( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.074391 1 0.000042
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.e( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.075107 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.e( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.200637 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.5( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.081736 1 0.000052
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.5( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.082157 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.5( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.206530 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.5] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.2( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.088612 1 0.000054
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.2( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.089437 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.2( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.213283 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.2] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.8( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.095651 1 0.000061
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.8( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.096879 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.8( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.220540 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.5( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.103684 1 0.000055
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.5( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.105003 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.5( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.229260 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.7( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.110314 1 0.000127
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.7( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.111749 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.7( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.235527 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.e( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.117562 1 0.000044
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.e( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.119111 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.e( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.242569 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.e] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1d( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.124828 1 0.000141
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1d( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.126513 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1d( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.249820 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.131583 1 0.000086
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.134087 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.258131 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.a( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.138570 1 0.000056
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.a( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.141168 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.a( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.266869 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.a] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1a( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.145940 1 0.000083
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1a( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.148599 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1a( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.271749 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1a] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.c( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.153311 1 0.000103
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.c( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.156105 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.c( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.279691 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.c] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1e( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.160571 1 0.000052
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1e( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.163406 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1e( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [2] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.286415 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.13( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.166513 1 0.000135
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.13( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.166969 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.13( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.297749 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.13] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.15( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.173744 1 0.000147
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.15( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.174316 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.15( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.304892 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.12( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.180823 1 0.000053
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.12( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.181660 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.12( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.311995 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.9( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.188294 1 0.000097
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.9( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.189204 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.9( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.316778 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.a( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.195677 1 0.000045
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.a( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.196672 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.a( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.324206 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.6( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.202718 1 0.000033
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.6( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.203764 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.6( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.331781 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.3( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.210207 1 0.000054
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.3( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.211228 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.3( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.339924 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.c( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.217437 1 0.000051
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.c( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.218482 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.c( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.348164 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.224968 1 0.000039
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.226045 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.354762 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.6( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.232345 1 0.000033
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.6( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.233491 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.6( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.362776 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.6] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.4( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.239801 1 0.000034
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.4( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.240988 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.4( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.369333 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.4] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1b( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.246939 1 0.000030
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1b( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.248139 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.1b( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.375591 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.f( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.254142 1 0.000059
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.f( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.255384 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[3.f( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=-1 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.385620 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1f( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.261572 1 0.000106
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1f( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.262919 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.1f( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.390330 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.1f] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.f( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.269302 1 0.000037
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.f( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.270734 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.f( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.400265 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.f] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.3( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.276355 1 0.000092
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.3( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.277710 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.3( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.405904 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.3] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.18( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.283604 1 0.000051
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.18( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.285182 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.18( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.412663 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.18] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.9( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.291106 1 0.000039
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.9( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.292587 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 pg_epoch: 47 pg[7.9( empty lb MIN local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.422772 0 0.000000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 scrub-queue::remove_from_osd_queue removing pg[7.9] failed. State was: not registered w/ OSD
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:14:58.796995+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 57868288 unmapped: 1843200 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e6000/0x0/0x4ffc00000, data 0x9dc1e/0xe6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e6000/0x0/0x4ffc00000, data 0x9dc1e/0xe6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:14:59.797166+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 57901056 unmapped: 1810432 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 352071 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:00.797353+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 57917440 unmapped: 1794048 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e6000/0x0/0x4ffc00000, data 0x9dc1e/0xe6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858400
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:01.797473+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 57974784 unmapped: 1736704 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:02.797787+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58032128 unmapped: 1679360 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:03.797998+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58032128 unmapped: 1679360 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e5000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.d deep-scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.d deep-scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:04.798113+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 15 sent 13 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:15:34.620021+0000 osd.1 (osd.1) 14 : cluster [DBG] 3.d deep-scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:15:34.634158+0000 osd.1 (osd.1) 15 : cluster [DBG] 3.d deep-scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 15) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:15:34.620021+0000 osd.1 (osd.1) 14 : cluster [DBG] 3.d deep-scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:15:34.634158+0000 osd.1 (osd.1) 15 : cluster [DBG] 3.d deep-scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58032128 unmapped: 1679360 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.10 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.10 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 361942 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:05.798756+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 17 sent 15 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:15:35.571104+0000 osd.1 (osd.1) 16 : cluster [DBG] 3.10 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:15:35.585196+0000 osd.1 (osd.1) 17 : cluster [DBG] 3.10 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 17) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:15:35.571104+0000 osd.1 (osd.1) 16 : cluster [DBG] 3.10 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:15:35.585196+0000 osd.1 (osd.1) 17 : cluster [DBG] 3.10 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58040320 unmapped: 1671168 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:06.799015+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58040320 unmapped: 1671168 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:07.799144+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58015744 unmapped: 1695744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:08.799316+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58015744 unmapped: 1695744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.13 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.520864487s of 12.925769806s, submitted: 415
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.13 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:09.799649+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 19 sent 17 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:15:39.555415+0000 osd.1 (osd.1) 18 : cluster [DBG] 3.13 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:15:39.569181+0000 osd.1 (osd.1) 19 : cluster [DBG] 3.13 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 19) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:15:39.555415+0000 osd.1 (osd.1) 18 : cluster [DBG] 3.13 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:15:39.569181+0000 osd.1 (osd.1) 19 : cluster [DBG] 3.13 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58015744 unmapped: 1695744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:10.799845+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 363090 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58015744 unmapped: 1695744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:11.799973+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58015744 unmapped: 1695744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:12.800101+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58040320 unmapped: 1671168 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.14 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.14 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:13.800237+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 21 sent 19 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:15:43.573238+0000 osd.1 (osd.1) 20 : cluster [DBG] 3.14 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:15:43.587279+0000 osd.1 (osd.1) 21 : cluster [DBG] 3.14 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58048512 unmapped: 1662976 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 21) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:15:43.573238+0000 osd.1 (osd.1) 20 : cluster [DBG] 3.14 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:15:43.587279+0000 osd.1 (osd.1) 21 : cluster [DBG] 3.14 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:14.800446+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58056704 unmapped: 1654784 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:15.800579+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 364238 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58064896 unmapped: 1646592 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:16.800736+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58064896 unmapped: 1646592 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:17.800867+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58032128 unmapped: 1679360 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:18.801007+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58040320 unmapped: 1671168 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.19 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.19 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.918129921s of 10.043713570s, submitted: 6
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:19.801255+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 23 sent 21 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:15:49.480735+0000 osd.1 (osd.1) 22 : cluster [DBG] 3.19 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:15:49.494809+0000 osd.1 (osd.1) 23 : cluster [DBG] 3.19 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58040320 unmapped: 1671168 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 23) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:15:49.480735+0000 osd.1 (osd.1) 22 : cluster [DBG] 3.19 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:15:49.494809+0000 osd.1 (osd.1) 23 : cluster [DBG] 3.19 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:20.801501+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 365386 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58048512 unmapped: 1662976 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:21.801642+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58048512 unmapped: 1662976 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:22.801784+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58073088 unmapped: 1638400 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:23.801940+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58081280 unmapped: 1630208 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.1a scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.1a scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:24.802079+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 25 sent 23 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:15:54.480484+0000 osd.1 (osd.1) 24 : cluster [DBG] 3.1a scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:15:54.494637+0000 osd.1 (osd.1) 25 : cluster [DBG] 3.1a scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58073088 unmapped: 1638400 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 25) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:15:54.480484+0000 osd.1 (osd.1) 24 : cluster [DBG] 3.1a scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:15:54.494637+0000 osd.1 (osd.1) 25 : cluster [DBG] 3.1a scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.1c scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 3.1c scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:25.802322+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 27 sent 25 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:15:55.461337+0000 osd.1 (osd.1) 26 : cluster [DBG] 3.1c scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:15:55.475407+0000 osd.1 (osd.1) 27 : cluster [DBG] 3.1c scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 367682 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58073088 unmapped: 1638400 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 27) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:15:55.461337+0000 osd.1 (osd.1) 26 : cluster [DBG] 3.1c scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:15:55.475407+0000 osd.1 (osd.1) 27 : cluster [DBG] 3.1c scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:26.802531+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58081280 unmapped: 1630208 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.7 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.7 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:27.802655+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 29 sent 27 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:15:57.531156+0000 osd.1 (osd.1) 28 : cluster [DBG] 7.7 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:15:57.545535+0000 osd.1 (osd.1) 29 : cluster [DBG] 7.7 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58097664 unmapped: 1613824 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 29) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:15:57.531156+0000 osd.1 (osd.1) 28 : cluster [DBG] 7.7 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:15:57.545535+0000 osd.1 (osd.1) 29 : cluster [DBG] 7.7 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:28.802901+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58105856 unmapped: 1605632 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:29.803112+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58105856 unmapped: 1605632 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:30.803240+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 368829 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58114048 unmapped: 1597440 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.b scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.829093933s of 11.850773811s, submitted: 6
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.b scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:31.803382+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 31 sent 29 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:01.449620+0000 osd.1 (osd.1) 30 : cluster [DBG] 7.b scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:01.463675+0000 osd.1 (osd.1) 31 : cluster [DBG] 7.b scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58163200 unmapped: 1548288 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 31) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:01.449620+0000 osd.1 (osd.1) 30 : cluster [DBG] 7.b scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:01.463675+0000 osd.1 (osd.1) 31 : cluster [DBG] 7.b scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:32.803683+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58171392 unmapped: 1540096 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:33.803816+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58171392 unmapped: 1540096 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:34.804004+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58171392 unmapped: 1540096 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:35.804174+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 369976 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58179584 unmapped: 1531904 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.d scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.d scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:36.804338+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 33 sent 31 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:06.428229+0000 osd.1 (osd.1) 32 : cluster [DBG] 7.d scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:06.442376+0000 osd.1 (osd.1) 33 : cluster [DBG] 7.d scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58179584 unmapped: 1531904 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 33) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:06.428229+0000 osd.1 (osd.1) 32 : cluster [DBG] 7.d scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:06.442376+0000 osd.1 (osd.1) 33 : cluster [DBG] 7.d scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:37.804516+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58179584 unmapped: 1531904 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:38.804817+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58195968 unmapped: 1515520 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.10 deep-scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.10 deep-scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:39.805185+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 35 sent 33 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:09.379474+0000 osd.1 (osd.1) 34 : cluster [DBG] 7.10 deep-scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:09.393481+0000 osd.1 (osd.1) 35 : cluster [DBG] 7.10 deep-scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58204160 unmapped: 1507328 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 35) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:09.379474+0000 osd.1 (osd.1) 34 : cluster [DBG] 7.10 deep-scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:09.393481+0000 osd.1 (osd.1) 35 : cluster [DBG] 7.10 deep-scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:40.805425+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 372271 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58212352 unmapped: 1499136 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:41.805583+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58212352 unmapped: 1499136 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:42.805719+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58220544 unmapped: 1490944 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:43.805921+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58228736 unmapped: 1482752 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:44.806071+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58228736 unmapped: 1482752 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.12 deep-scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.794986725s of 13.843894005s, submitted: 6
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.12 deep-scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:45.806215+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 37 sent 35 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:15.293763+0000 osd.1 (osd.1) 36 : cluster [DBG] 7.12 deep-scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:15.307763+0000 osd.1 (osd.1) 37 : cluster [DBG] 7.12 deep-scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 373419 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58228736 unmapped: 1482752 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 37) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:15.293763+0000 osd.1 (osd.1) 36 : cluster [DBG] 7.12 deep-scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:15.307763+0000 osd.1 (osd.1) 37 : cluster [DBG] 7.12 deep-scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:46.806495+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58245120 unmapped: 1466368 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:47.806659+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58253312 unmapped: 1458176 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:48.806829+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58253312 unmapped: 1458176 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.14 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.14 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:49.807021+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 39 sent 37 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:19.188974+0000 osd.1 (osd.1) 38 : cluster [DBG] 7.14 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:19.202871+0000 osd.1 (osd.1) 39 : cluster [DBG] 7.14 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58253312 unmapped: 1458176 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.16 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.16 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 39) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:19.188974+0000 osd.1 (osd.1) 38 : cluster [DBG] 7.14 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:19.202871+0000 osd.1 (osd.1) 39 : cluster [DBG] 7.14 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:50.807228+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 41 sent 39 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:20.206734+0000 osd.1 (osd.1) 40 : cluster [DBG] 7.16 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:20.220883+0000 osd.1 (osd.1) 41 : cluster [DBG] 7.16 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 375715 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58269696 unmapped: 1441792 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 41) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:20.206734+0000 osd.1 (osd.1) 40 : cluster [DBG] 7.16 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:20.220883+0000 osd.1 (osd.1) 41 : cluster [DBG] 7.16 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:51.807693+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58269696 unmapped: 1441792 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:52.807823+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58294272 unmapped: 1417216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:53.807970+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58294272 unmapped: 1417216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:54.808075+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58294272 unmapped: 1417216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:55.808325+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 375715 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58302464 unmapped: 1409024 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.17 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.927809715s of 10.960511208s, submitted: 6
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.17 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:56.808541+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 43 sent 41 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:26.254024+0000 osd.1 (osd.1) 42 : cluster [DBG] 7.17 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:26.268040+0000 osd.1 (osd.1) 43 : cluster [DBG] 7.17 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58310656 unmapped: 1400832 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.19 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.19 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:57.809008+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 4 last_log 45 sent 43 num 4 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:27.264410+0000 osd.1 (osd.1) 44 : cluster [DBG] 7.19 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:27.278501+0000 osd.1 (osd.1) 45 : cluster [DBG] 7.19 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58318848 unmapped: 1392640 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 43) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:26.254024+0000 osd.1 (osd.1) 42 : cluster [DBG] 7.17 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:26.268040+0000 osd.1 (osd.1) 43 : cluster [DBG] 7.17 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:58.809192+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58318848 unmapped: 1392640 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.1d scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.1d scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 45) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:27.264410+0000 osd.1 (osd.1) 44 : cluster [DBG] 7.19 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:27.278501+0000 osd.1 (osd.1) 45 : cluster [DBG] 7.19 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:59.809489+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 47 sent 45 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:29.313883+0000 osd.1 (osd.1) 46 : cluster [DBG] 7.1d scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:29.327926+0000 osd.1 (osd.1) 47 : cluster [DBG] 7.1d scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58335232 unmapped: 1376256 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.1e scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 7.1e scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 47) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:29.313883+0000 osd.1 (osd.1) 46 : cluster [DBG] 7.1d scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:29.327926+0000 osd.1 (osd.1) 47 : cluster [DBG] 7.1d scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:00.809690+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 49 sent 47 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:30.280448+0000 osd.1 (osd.1) 48 : cluster [DBG] 7.1e scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:30.294515+0000 osd.1 (osd.1) 49 : cluster [DBG] 7.1e scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 380307 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58351616 unmapped: 1359872 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 49) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:30.280448+0000 osd.1 (osd.1) 48 : cluster [DBG] 7.1e scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:30.294515+0000 osd.1 (osd.1) 49 : cluster [DBG] 7.1e scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:01.809972+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58351616 unmapped: 1359872 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:02.810412+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58359808 unmapped: 1351680 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1e scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1e scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:03.810599+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 51 sent 49 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:33.213917+0000 osd.1 (osd.1) 50 : cluster [DBG] 6.1e scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:33.228076+0000 osd.1 (osd.1) 51 : cluster [DBG] 6.1e scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58359808 unmapped: 1351680 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 51) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:33.213917+0000 osd.1 (osd.1) 50 : cluster [DBG] 6.1e scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:33.228076+0000 osd.1 (osd.1) 51 : cluster [DBG] 6.1e scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:04.811056+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58368000 unmapped: 1343488 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:05.811241+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 381455 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58368000 unmapped: 1343488 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:06.811393+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.18 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.706548691s of 10.881001472s, submitted: 10
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58376192 unmapped: 1335296 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.18 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:07.811519+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 53 sent 51 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:37.135186+0000 osd.1 (osd.1) 52 : cluster [DBG] 5.18 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:37.149358+0000 osd.1 (osd.1) 53 : cluster [DBG] 5.18 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.19 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.19 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58376192 unmapped: 1335296 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 53) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:37.135186+0000 osd.1 (osd.1) 52 : cluster [DBG] 5.18 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:37.149358+0000 osd.1 (osd.1) 53 : cluster [DBG] 5.18 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:08.811735+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 55 sent 53 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:38.109588+0000 osd.1 (osd.1) 54 : cluster [DBG] 5.19 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:38.123475+0000 osd.1 (osd.1) 55 : cluster [DBG] 5.19 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58384384 unmapped: 1327104 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 55) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:38.109588+0000 osd.1 (osd.1) 54 : cluster [DBG] 5.19 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:38.123475+0000 osd.1 (osd.1) 55 : cluster [DBG] 5.19 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:09.811997+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58392576 unmapped: 1318912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:10.812103+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 383751 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58392576 unmapped: 1318912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:11.812233+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1a scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1a scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58392576 unmapped: 1318912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:12.812353+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 57 sent 55 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:42.069793+0000 osd.1 (osd.1) 56 : cluster [DBG] 5.1a scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:42.083790+0000 osd.1 (osd.1) 57 : cluster [DBG] 5.1a scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58400768 unmapped: 1310720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 57) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:42.069793+0000 osd.1 (osd.1) 56 : cluster [DBG] 5.1a scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:42.083790+0000 osd.1 (osd.1) 57 : cluster [DBG] 5.1a scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:13.812588+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1d scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1d scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58417152 unmapped: 1294336 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:14.812723+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 59 sent 57 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:44.098536+0000 osd.1 (osd.1) 58 : cluster [DBG] 5.1d scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:44.112654+0000 osd.1 (osd.1) 59 : cluster [DBG] 5.1d scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58425344 unmapped: 1286144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 59) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:44.098536+0000 osd.1 (osd.1) 58 : cluster [DBG] 5.1d scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:44.112654+0000 osd.1 (osd.1) 59 : cluster [DBG] 5.1d scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:15.812922+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 386047 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58425344 unmapped: 1286144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:16.813058+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58425344 unmapped: 1286144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:17.813199+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58441728 unmapped: 1269760 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:18.813336+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58441728 unmapped: 1269760 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:19.813580+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58449920 unmapped: 1261568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:20.813768+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 386047 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58449920 unmapped: 1261568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:21.813927+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.f deep-scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.945797920s of 14.976054192s, submitted: 8
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.f deep-scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58449920 unmapped: 1261568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:22.814097+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 61 sent 59 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:52.111194+0000 osd.1 (osd.1) 60 : cluster [DBG] 5.f deep-scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:52.125331+0000 osd.1 (osd.1) 61 : cluster [DBG] 5.f deep-scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58458112 unmapped: 1253376 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 61) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:52.111194+0000 osd.1 (osd.1) 60 : cluster [DBG] 5.f deep-scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:52.125331+0000 osd.1 (osd.1) 61 : cluster [DBG] 5.f deep-scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:23.814574+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.9 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.9 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58466304 unmapped: 1245184 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:24.814703+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 63 sent 61 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:54.106839+0000 osd.1 (osd.1) 62 : cluster [DBG] 2.9 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:54.120955+0000 osd.1 (osd.1) 63 : cluster [DBG] 2.9 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58474496 unmapped: 1236992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 63) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:54.106839+0000 osd.1 (osd.1) 62 : cluster [DBG] 2.9 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:54.120955+0000 osd.1 (osd.1) 63 : cluster [DBG] 2.9 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:25.814966+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 388341 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.d scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.d scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58474496 unmapped: 1236992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:26.815118+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 65 sent 63 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:56.110650+0000 osd.1 (osd.1) 64 : cluster [DBG] 4.d scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:56.124687+0000 osd.1 (osd.1) 65 : cluster [DBG] 4.d scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.c scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.c scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58482688 unmapped: 1228800 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 65) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:56.110650+0000 osd.1 (osd.1) 64 : cluster [DBG] 4.d scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:56.124687+0000 osd.1 (osd.1) 65 : cluster [DBG] 4.d scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:27.815382+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 67 sent 65 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:57.076603+0000 osd.1 (osd.1) 66 : cluster [DBG] 6.c scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:57.090754+0000 osd.1 (osd.1) 67 : cluster [DBG] 6.c scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58490880 unmapped: 1220608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 67) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:57.076603+0000 osd.1 (osd.1) 66 : cluster [DBG] 6.c scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:57.090754+0000 osd.1 (osd.1) 67 : cluster [DBG] 6.c scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:28.815575+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.6 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.6 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58490880 unmapped: 1220608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:29.815772+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 69 sent 67 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:59.068688+0000 osd.1 (osd.1) 68 : cluster [DBG] 2.6 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:59.082801+0000 osd.1 (osd.1) 69 : cluster [DBG] 2.6 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58499072 unmapped: 1212416 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 69) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:59.068688+0000 osd.1 (osd.1) 68 : cluster [DBG] 2.6 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:59.082801+0000 osd.1 (osd.1) 69 : cluster [DBG] 2.6 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:30.815924+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 391782 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.c deep-scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.c deep-scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58507264 unmapped: 1204224 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:31.816053+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 71 sent 69 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:01.076602+0000 osd.1 (osd.1) 70 : cluster [DBG] 5.c deep-scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:01.090715+0000 osd.1 (osd.1) 71 : cluster [DBG] 5.c deep-scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.d scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.d scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58507264 unmapped: 1204224 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 71) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:01.076602+0000 osd.1 (osd.1) 70 : cluster [DBG] 5.c deep-scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:01.090715+0000 osd.1 (osd.1) 71 : cluster [DBG] 5.c deep-scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:32.816345+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 73 sent 71 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:02.080714+0000 osd.1 (osd.1) 72 : cluster [DBG] 6.d scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:02.098382+0000 osd.1 (osd.1) 73 : cluster [DBG] 6.d scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1 deep-scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.828042984s of 10.924718857s, submitted: 14
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1 deep-scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58523648 unmapped: 1187840 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 73) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:02.080714+0000 osd.1 (osd.1) 72 : cluster [DBG] 6.d scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:02.098382+0000 osd.1 (osd.1) 73 : cluster [DBG] 6.d scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:33.816601+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 75 sent 73 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:03.035844+0000 osd.1 (osd.1) 74 : cluster [DBG] 5.1 deep-scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:03.049961+0000 osd.1 (osd.1) 75 : cluster [DBG] 5.1 deep-scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.f scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.f scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58531840 unmapped: 1179648 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 75) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:03.035844+0000 osd.1 (osd.1) 74 : cluster [DBG] 5.1 deep-scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:03.049961+0000 osd.1 (osd.1) 75 : cluster [DBG] 5.1 deep-scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:34.816776+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 77 sent 75 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:04.067468+0000 osd.1 (osd.1) 76 : cluster [DBG] 4.f scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:04.081451+0000 osd.1 (osd.1) 77 : cluster [DBG] 4.f scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.5 deep-scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.5 deep-scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58540032 unmapped: 1171456 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:35.816966+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 4 last_log 79 sent 77 num 4 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:05.018694+0000 osd.1 (osd.1) 78 : cluster [DBG] 2.5 deep-scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:05.032744+0000 osd.1 (osd.1) 79 : cluster [DBG] 2.5 deep-scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 77) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:04.067468+0000 osd.1 (osd.1) 76 : cluster [DBG] 4.f scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:04.081451+0000 osd.1 (osd.1) 77 : cluster [DBG] 4.f scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.7 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.7 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 398664 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58548224 unmapped: 1163264 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:36.817198+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 4 last_log 81 sent 79 num 4 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:06.053944+0000 osd.1 (osd.1) 80 : cluster [DBG] 2.7 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:06.068056+0000 osd.1 (osd.1) 81 : cluster [DBG] 2.7 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 79) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:05.018694+0000 osd.1 (osd.1) 78 : cluster [DBG] 2.5 deep-scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:05.032744+0000 osd.1 (osd.1) 79 : cluster [DBG] 2.5 deep-scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 81) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:06.053944+0000 osd.1 (osd.1) 80 : cluster [DBG] 2.7 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:06.068056+0000 osd.1 (osd.1) 81 : cluster [DBG] 2.7 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58556416 unmapped: 1155072 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:37.817406+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58572800 unmapped: 1138688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:38.817549+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58572800 unmapped: 1138688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:39.817711+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58580992 unmapped: 1130496 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:40.817843+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 398664 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58580992 unmapped: 1130496 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:41.817981+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58589184 unmapped: 1122304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:42.818108+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58597376 unmapped: 1114112 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:43.818234+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.3 deep-scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.857633591s of 11.003772736s, submitted: 8
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.3 deep-scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58605568 unmapped: 1105920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:44.818367+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 83 sent 81 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:14.039840+0000 osd.1 (osd.1) 82 : cluster [DBG] 2.3 deep-scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:14.053886+0000 osd.1 (osd.1) 83 : cluster [DBG] 2.3 deep-scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 83) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:14.039840+0000 osd.1 (osd.1) 82 : cluster [DBG] 2.3 deep-scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:14.053886+0000 osd.1 (osd.1) 83 : cluster [DBG] 2.3 deep-scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58605568 unmapped: 1105920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:45.818547+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 399811 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58605568 unmapped: 1105920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:46.818676+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58613760 unmapped: 1097728 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:47.818807+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.4 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.4 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58621952 unmapped: 1089536 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:48.818932+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 85 sent 83 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:18.086898+0000 osd.1 (osd.1) 84 : cluster [DBG] 2.4 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:18.101033+0000 osd.1 (osd.1) 85 : cluster [DBG] 2.4 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 85) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:18.086898+0000 osd.1 (osd.1) 84 : cluster [DBG] 2.4 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:18.101033+0000 osd.1 (osd.1) 85 : cluster [DBG] 2.4 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58621952 unmapped: 1089536 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:49.819162+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58630144 unmapped: 1081344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:50.819316+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 400958 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58630144 unmapped: 1081344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:51.819466+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58638336 unmapped: 1073152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:52.819747+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.9 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.9 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58646528 unmapped: 1064960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:53.819890+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 87 sent 85 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:23.171732+0000 osd.1 (osd.1) 86 : cluster [DBG] 5.9 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:23.185809+0000 osd.1 (osd.1) 87 : cluster [DBG] 5.9 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 87) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:23.171732+0000 osd.1 (osd.1) 86 : cluster [DBG] 5.9 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:23.185809+0000 osd.1 (osd.1) 87 : cluster [DBG] 5.9 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58654720 unmapped: 1056768 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:54.820058+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58654720 unmapped: 1056768 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:55.820440+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 402105 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.16 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.033282280s of 12.157509804s, submitted: 6
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.16 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58671104 unmapped: 1040384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:56.820634+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 89 sent 87 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:26.197393+0000 osd.1 (osd.1) 88 : cluster [DBG] 5.16 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:26.211663+0000 osd.1 (osd.1) 89 : cluster [DBG] 5.16 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 89) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:26.197393+0000 osd.1 (osd.1) 88 : cluster [DBG] 5.16 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:26.211663+0000 osd.1 (osd.1) 89 : cluster [DBG] 5.16 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58671104 unmapped: 1040384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:57.820940+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58687488 unmapped: 1024000 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:58.821084+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58695680 unmapped: 1015808 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:59.821321+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.15 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.15 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58695680 unmapped: 1015808 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:00.821445+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 91 sent 89 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:30.174708+0000 osd.1 (osd.1) 90 : cluster [DBG] 2.15 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:30.188775+0000 osd.1 (osd.1) 91 : cluster [DBG] 2.15 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 404401 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 91) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:30.174708+0000 osd.1 (osd.1) 90 : cluster [DBG] 2.15 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:30.188775+0000 osd.1 (osd.1) 91 : cluster [DBG] 2.15 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58703872 unmapped: 1007616 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:01.821661+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.12 deep-scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.12 deep-scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58720256 unmapped: 991232 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:02.821775+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 93 sent 91 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:32.221876+0000 osd.1 (osd.1) 92 : cluster [DBG] 5.12 deep-scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:32.235942+0000 osd.1 (osd.1) 93 : cluster [DBG] 5.12 deep-scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 93) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:32.221876+0000 osd.1 (osd.1) 92 : cluster [DBG] 5.12 deep-scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:32.235942+0000 osd.1 (osd.1) 93 : cluster [DBG] 5.12 deep-scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58720256 unmapped: 991232 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:03.821923+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58728448 unmapped: 983040 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:04.822244+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58728448 unmapped: 983040 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:05.822485+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 405549 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58736640 unmapped: 974848 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:06.822734+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58736640 unmapped: 974848 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:07.823238+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.13 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.857577324s of 11.936425209s, submitted: 6
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.13 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58761216 unmapped: 950272 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:08.823446+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 95 sent 93 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:38.133696+0000 osd.1 (osd.1) 94 : cluster [DBG] 5.13 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:38.147872+0000 osd.1 (osd.1) 95 : cluster [DBG] 5.13 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 95) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:38.133696+0000 osd.1 (osd.1) 94 : cluster [DBG] 5.13 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:38.147872+0000 osd.1 (osd.1) 95 : cluster [DBG] 5.13 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58761216 unmapped: 950272 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:09.823668+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.17 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.17 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58785792 unmapped: 925696 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:10.823848+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 97 sent 95 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:40.178448+0000 osd.1 (osd.1) 96 : cluster [DBG] 2.17 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:40.192812+0000 osd.1 (osd.1) 97 : cluster [DBG] 2.17 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 97) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:40.178448+0000 osd.1 (osd.1) 96 : cluster [DBG] 2.17 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:40.192812+0000 osd.1 (osd.1) 97 : cluster [DBG] 2.17 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 407845 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58793984 unmapped: 917504 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:11.824081+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58793984 unmapped: 917504 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:12.824249+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58793984 unmapped: 917504 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:13.824410+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58802176 unmapped: 909312 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:14.824583+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58810368 unmapped: 901120 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:15.824732+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.a scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.a scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 408992 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58818560 unmapped: 892928 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:16.824909+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 99 sent 97 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:46.183927+0000 osd.1 (osd.1) 98 : cluster [DBG] 2.a scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:46.198038+0000 osd.1 (osd.1) 99 : cluster [DBG] 2.a scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 99) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:46.183927+0000 osd.1 (osd.1) 98 : cluster [DBG] 2.a scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:46.198038+0000 osd.1 (osd.1) 99 : cluster [DBG] 2.a scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58818560 unmapped: 892928 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:17.825128+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.d scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.989212036s of 10.056298256s, submitted: 6
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.d scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58834944 unmapped: 876544 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:18.825277+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 101 sent 99 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:48.190144+0000 osd.1 (osd.1) 100 : cluster [DBG] 2.d scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:48.204241+0000 osd.1 (osd.1) 101 : cluster [DBG] 2.d scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 101) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:48.190144+0000 osd.1 (osd.1) 100 : cluster [DBG] 2.d scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:48.204241+0000 osd.1 (osd.1) 101 : cluster [DBG] 2.d scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58843136 unmapped: 868352 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:19.825736+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58843136 unmapped: 868352 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:20.825898+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.6 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.6 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411286 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58851328 unmapped: 860160 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:21.826066+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 103 sent 101 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:51.191890+0000 osd.1 (osd.1) 102 : cluster [DBG] 6.6 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:51.209490+0000 osd.1 (osd.1) 103 : cluster [DBG] 6.6 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.4 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.4 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58851328 unmapped: 860160 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 103) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:51.191890+0000 osd.1 (osd.1) 102 : cluster [DBG] 6.6 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:51.209490+0000 osd.1 (osd.1) 103 : cluster [DBG] 6.6 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:22.826342+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 105 sent 103 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:52.230020+0000 osd.1 (osd.1) 104 : cluster [DBG] 4.4 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:52.244120+0000 osd.1 (osd.1) 105 : cluster [DBG] 4.4 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58859520 unmapped: 851968 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 105) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:52.230020+0000 osd.1 (osd.1) 104 : cluster [DBG] 4.4 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:52.244120+0000 osd.1 (osd.1) 105 : cluster [DBG] 4.4 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:23.826560+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 107 sent 105 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:53.232235+0000 osd.1 (osd.1) 106 : cluster [DBG] 6.1 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:53.246302+0000 osd.1 (osd.1) 107 : cluster [DBG] 6.1 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58859520 unmapped: 851968 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 107) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:53.232235+0000 osd.1 (osd.1) 106 : cluster [DBG] 6.1 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:53.246302+0000 osd.1 (osd.1) 107 : cluster [DBG] 6.1 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:24.826756+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58859520 unmapped: 851968 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:25.826919+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.7 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.7 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 414727 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58867712 unmapped: 843776 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:26.827096+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 109 sent 107 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:56.196278+0000 osd.1 (osd.1) 108 : cluster [DBG] 4.7 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:56.210230+0000 osd.1 (osd.1) 109 : cluster [DBG] 4.7 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58867712 unmapped: 843776 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 109) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:56.196278+0000 osd.1 (osd.1) 108 : cluster [DBG] 4.7 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:56.210230+0000 osd.1 (osd.1) 109 : cluster [DBG] 4.7 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:27.827355+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.b scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.990326881s of 10.023887634s, submitted: 10
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.b scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58875904 unmapped: 835584 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:28.827502+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 111 sent 109 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:58.213887+0000 osd.1 (osd.1) 110 : cluster [DBG] 6.b scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:58.231581+0000 osd.1 (osd.1) 111 : cluster [DBG] 6.b scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.5 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.5 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58892288 unmapped: 819200 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 111) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:58.213887+0000 osd.1 (osd.1) 110 : cluster [DBG] 6.b scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:58.231581+0000 osd.1 (osd.1) 111 : cluster [DBG] 6.b scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:29.828205+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 113 sent 111 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:59.204601+0000 osd.1 (osd.1) 112 : cluster [DBG] 4.5 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:59.218668+0000 osd.1 (osd.1) 113 : cluster [DBG] 4.5 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58908672 unmapped: 802816 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 113) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:59.204601+0000 osd.1 (osd.1) 112 : cluster [DBG] 4.5 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:59.218668+0000 osd.1 (osd.1) 113 : cluster [DBG] 4.5 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:30.828380+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 417021 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58908672 unmapped: 802816 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:31.830649+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.e scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.e scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58908672 unmapped: 802816 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:32.830796+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 115 sent 113 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:02.187324+0000 osd.1 (osd.1) 114 : cluster [DBG] 6.e scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:02.204891+0000 osd.1 (osd.1) 115 : cluster [DBG] 6.e scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58925056 unmapped: 786432 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 115) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:02.187324+0000 osd.1 (osd.1) 114 : cluster [DBG] 6.e scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:02.204891+0000 osd.1 (osd.1) 115 : cluster [DBG] 6.e scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:33.831135+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58925056 unmapped: 786432 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:34.831341+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.9 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.9 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58933248 unmapped: 778240 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:35.831779+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 117 sent 115 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:05.177850+0000 osd.1 (osd.1) 116 : cluster [DBG] 4.9 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:05.191965+0000 osd.1 (osd.1) 117 : cluster [DBG] 4.9 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.8 deep-scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.8 deep-scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 420462 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58966016 unmapped: 745472 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 117) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:05.177850+0000 osd.1 (osd.1) 116 : cluster [DBG] 4.9 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:05.191965+0000 osd.1 (osd.1) 117 : cluster [DBG] 4.9 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:36.832180+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 119 sent 117 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:06.158642+0000 osd.1 (osd.1) 118 : cluster [DBG] 4.8 deep-scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:06.172862+0000 osd.1 (osd.1) 119 : cluster [DBG] 4.8 deep-scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.17 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.17 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58974208 unmapped: 737280 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 119) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:06.158642+0000 osd.1 (osd.1) 118 : cluster [DBG] 4.8 deep-scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:06.172862+0000 osd.1 (osd.1) 119 : cluster [DBG] 4.8 deep-scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:37.832459+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 121 sent 119 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:07.208220+0000 osd.1 (osd.1) 120 : cluster [DBG] 6.17 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:07.222392+0000 osd.1 (osd.1) 121 : cluster [DBG] 6.17 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.14 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.916434288s of 10.038483620s, submitted: 12
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.14 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58982400 unmapped: 729088 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 121) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:07.208220+0000 osd.1 (osd.1) 120 : cluster [DBG] 6.17 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:07.222392+0000 osd.1 (osd.1) 121 : cluster [DBG] 6.17 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:38.833133+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 123 sent 121 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:08.252604+0000 osd.1 (osd.1) 122 : cluster [DBG] 4.14 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:08.266700+0000 osd.1 (osd.1) 123 : cluster [DBG] 4.14 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58982400 unmapped: 729088 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 123) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:08.252604+0000 osd.1 (osd.1) 122 : cluster [DBG] 4.14 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:08.266700+0000 osd.1 (osd.1) 123 : cluster [DBG] 4.14 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:39.833360+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58990592 unmapped: 720896 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:40.833810+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.12 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.12 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 423906 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58998784 unmapped: 712704 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:41.834070+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 125 sent 123 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:11.240823+0000 osd.1 (osd.1) 124 : cluster [DBG] 4.12 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:11.254856+0000 osd.1 (osd.1) 125 : cluster [DBG] 4.12 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 125) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:11.240823+0000 osd.1 (osd.1) 124 : cluster [DBG] 4.12 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:11.254856+0000 osd.1 (osd.1) 125 : cluster [DBG] 4.12 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.10 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.10 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59039744 unmapped: 671744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:42.834259+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 127 sent 125 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:12.226843+0000 osd.1 (osd.1) 126 : cluster [DBG] 4.10 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:12.241115+0000 osd.1 (osd.1) 127 : cluster [DBG] 4.10 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 127) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:12.226843+0000 osd.1 (osd.1) 126 : cluster [DBG] 4.10 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:12.241115+0000 osd.1 (osd.1) 127 : cluster [DBG] 4.10 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.2 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.2 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59056128 unmapped: 655360 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:43.834820+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 129 sent 127 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:13.235284+0000 osd.1 (osd.1) 128 : cluster [DBG] 6.2 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:13.249338+0000 osd.1 (osd.1) 129 : cluster [DBG] 6.2 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 129) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:13.235284+0000 osd.1 (osd.1) 128 : cluster [DBG] 6.2 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:13.249338+0000 osd.1 (osd.1) 129 : cluster [DBG] 6.2 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59056128 unmapped: 655360 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:44.835370+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59064320 unmapped: 647168 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:45.835966+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.11 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.11 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 427349 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59072512 unmapped: 638976 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:46.836144+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 131 sent 129 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:16.283606+0000 osd.1 (osd.1) 130 : cluster [DBG] 5.11 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:16.297593+0000 osd.1 (osd.1) 131 : cluster [DBG] 5.11 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 131) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:16.283606+0000 osd.1 (osd.1) 130 : cluster [DBG] 5.11 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:16.297593+0000 osd.1 (osd.1) 131 : cluster [DBG] 5.11 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59088896 unmapped: 622592 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:47.836767+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.1b deep-scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.910740852s of 10.036123276s, submitted: 10
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.1b deep-scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59105280 unmapped: 606208 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:48.837174+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 133 sent 131 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:18.288934+0000 osd.1 (osd.1) 132 : cluster [DBG] 2.1b deep-scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:18.302868+0000 osd.1 (osd.1) 133 : cluster [DBG] 2.1b deep-scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 133) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:18.288934+0000 osd.1 (osd.1) 132 : cluster [DBG] 2.1b deep-scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:18.302868+0000 osd.1 (osd.1) 133 : cluster [DBG] 2.1b deep-scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59113472 unmapped: 598016 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:49.837905+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59113472 unmapped: 598016 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:50.838078+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 428497 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59113472 unmapped: 598016 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:51.838415+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59121664 unmapped: 589824 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:52.838741+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59121664 unmapped: 589824 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:53.838886+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1d scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1d scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59121664 unmapped: 589824 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:54.839078+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 135 sent 133 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:24.241540+0000 osd.1 (osd.1) 134 : cluster [DBG] 6.1d scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:24.259223+0000 osd.1 (osd.1) 135 : cluster [DBG] 6.1d scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 135) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:24.241540+0000 osd.1 (osd.1) 134 : cluster [DBG] 6.1d scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:24.259223+0000 osd.1 (osd.1) 135 : cluster [DBG] 6.1d scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59129856 unmapped: 581632 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:55.839351+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1c scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1c scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 430793 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59129856 unmapped: 581632 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:56.839641+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 137 sent 135 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:26.311663+0000 osd.1 (osd.1) 136 : cluster [DBG] 6.1c scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:26.329368+0000 osd.1 (osd.1) 137 : cluster [DBG] 6.1c scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 137) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:26.311663+0000 osd.1 (osd.1) 136 : cluster [DBG] 6.1c scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:26.329368+0000 osd.1 (osd.1) 137 : cluster [DBG] 6.1c scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.4 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.4 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59146240 unmapped: 565248 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:57.839881+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 139 sent 137 num 2 unsent 2 sending 2
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:27.274463+0000 osd.1 (osd.1) 138 : cluster [DBG] 6.4 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:27.288558+0000 osd.1 (osd.1) 139 : cluster [DBG] 6.4 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 139) v1
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:27.274463+0000 osd.1 (osd.1) 138 : cluster [DBG] 6.4 scrub starts
Dec 01 09:39:32 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:27.288558+0000 osd.1 (osd.1) 139 : cluster [DBG] 6.4 scrub ok
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59146240 unmapped: 565248 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:58.840093+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59146240 unmapped: 565248 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:59.840473+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59154432 unmapped: 557056 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:00.840714+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59154432 unmapped: 557056 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:01.840921+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59162624 unmapped: 548864 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:02.841093+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59170816 unmapped: 540672 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:03.841390+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59179008 unmapped: 532480 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:04.841633+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59179008 unmapped: 532480 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:05.841887+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59179008 unmapped: 532480 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:06.842161+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59187200 unmapped: 524288 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:07.842378+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59187200 unmapped: 524288 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:08.842555+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59195392 unmapped: 516096 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:09.842799+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59195392 unmapped: 516096 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:10.842987+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59203584 unmapped: 507904 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:11.843249+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59203584 unmapped: 507904 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:12.843440+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59203584 unmapped: 507904 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:13.843610+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59211776 unmapped: 499712 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:14.843762+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59211776 unmapped: 499712 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:15.843997+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59211776 unmapped: 499712 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:16.844207+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59219968 unmapped: 491520 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:17.844479+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59219968 unmapped: 491520 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:18.844760+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59228160 unmapped: 483328 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:19.845053+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59228160 unmapped: 483328 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:20.845184+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59236352 unmapped: 475136 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:21.845387+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59244544 unmapped: 466944 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:22.845573+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59244544 unmapped: 466944 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:23.845694+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59252736 unmapped: 458752 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:24.845904+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59252736 unmapped: 458752 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:25.846060+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59260928 unmapped: 450560 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:26.846273+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59260928 unmapped: 450560 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:27.846461+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59260928 unmapped: 450560 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:28.846650+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59269120 unmapped: 442368 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:29.846935+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59269120 unmapped: 442368 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:30.847113+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59277312 unmapped: 434176 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:31.847253+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59269120 unmapped: 442368 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:32.847567+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59269120 unmapped: 442368 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:33.847713+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59277312 unmapped: 434176 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:34.847911+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59277312 unmapped: 434176 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:35.848074+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59285504 unmapped: 425984 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:36.848257+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59285504 unmapped: 425984 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:37.848425+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59285504 unmapped: 425984 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:38.848620+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59293696 unmapped: 417792 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:39.848851+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59293696 unmapped: 417792 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:40.849037+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59293696 unmapped: 417792 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:41.849244+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59301888 unmapped: 409600 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:42.849433+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59310080 unmapped: 401408 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:43.849582+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59310080 unmapped: 401408 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:44.849789+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59310080 unmapped: 401408 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:45.849946+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:46.850094+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:47.850341+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:48.850474+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59326464 unmapped: 385024 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:49.850659+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59326464 unmapped: 385024 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:50.850808+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59334656 unmapped: 376832 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:51.850992+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59334656 unmapped: 376832 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:52.851165+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59334656 unmapped: 376832 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:53.851345+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59342848 unmapped: 368640 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:54.851508+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59342848 unmapped: 368640 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:55.851705+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59351040 unmapped: 360448 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:56.851967+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59351040 unmapped: 360448 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:57.852201+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59359232 unmapped: 352256 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:58.852383+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59367424 unmapped: 344064 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:59.852702+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59367424 unmapped: 344064 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:00.852877+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59375616 unmapped: 335872 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:01.853535+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59392000 unmapped: 319488 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:02.853799+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59392000 unmapped: 319488 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:03.853941+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59400192 unmapped: 311296 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:04.854093+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59400192 unmapped: 311296 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:05.854269+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59408384 unmapped: 303104 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:06.854570+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59408384 unmapped: 303104 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:07.854793+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59408384 unmapped: 303104 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:08.855049+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59416576 unmapped: 294912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:09.855382+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59416576 unmapped: 294912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:10.855532+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59424768 unmapped: 286720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:11.855830+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59424768 unmapped: 286720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:12.856101+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59424768 unmapped: 286720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:13.856404+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59432960 unmapped: 278528 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:14.856583+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59432960 unmapped: 278528 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:15.856746+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59441152 unmapped: 270336 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:16.856922+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59441152 unmapped: 270336 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:17.857098+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:18.857278+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59449344 unmapped: 262144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:19.857626+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59449344 unmapped: 262144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:20.857820+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59449344 unmapped: 262144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:21.857974+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59457536 unmapped: 253952 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:22.858216+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59473920 unmapped: 237568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:23.858362+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59482112 unmapped: 229376 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:24.858530+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59482112 unmapped: 229376 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:25.858722+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59490304 unmapped: 221184 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:26.858870+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59490304 unmapped: 221184 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:27.859082+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59490304 unmapped: 221184 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:28.859308+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59498496 unmapped: 212992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:29.859983+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59498496 unmapped: 212992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:30.860139+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59506688 unmapped: 204800 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:31.860397+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59506688 unmapped: 204800 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:32.860589+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59514880 unmapped: 196608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:33.860764+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59514880 unmapped: 196608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:34.860939+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59514880 unmapped: 196608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:35.861198+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59523072 unmapped: 188416 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:36.861368+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59523072 unmapped: 188416 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:37.861480+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59523072 unmapped: 188416 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:38.861608+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59531264 unmapped: 180224 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:39.861841+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59531264 unmapped: 180224 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:40.861993+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59539456 unmapped: 172032 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:41.862124+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59539456 unmapped: 172032 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:42.862314+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59555840 unmapped: 155648 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:43.862484+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59564032 unmapped: 147456 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:44.862613+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59564032 unmapped: 147456 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:45.862752+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59572224 unmapped: 139264 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:46.863025+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59572224 unmapped: 139264 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:47.863186+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59572224 unmapped: 139264 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:48.863422+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59580416 unmapped: 131072 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:49.864330+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59580416 unmapped: 131072 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:50.864568+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59588608 unmapped: 122880 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:51.864771+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59588608 unmapped: 122880 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:52.865028+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59588608 unmapped: 122880 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:53.865160+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59596800 unmapped: 114688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:54.865359+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59596800 unmapped: 114688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:55.865639+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59604992 unmapped: 106496 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:56.865824+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59604992 unmapped: 106496 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:57.866083+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59604992 unmapped: 106496 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:58.866429+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59613184 unmapped: 98304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:59.866656+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59613184 unmapped: 98304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:00.866873+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59613184 unmapped: 98304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:01.867050+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59621376 unmapped: 90112 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:02.867203+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59637760 unmapped: 73728 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:03.867367+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59645952 unmapped: 65536 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:04.867488+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59645952 unmapped: 65536 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:05.867619+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:06.867744+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:07.867850+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:08.867994+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:09.868205+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:10.868358+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:11.868497+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:12.868636+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:13.868759+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59678720 unmapped: 32768 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:14.868874+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59678720 unmapped: 32768 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:15.869016+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59678720 unmapped: 32768 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:16.869151+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59686912 unmapped: 24576 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:17.869319+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59686912 unmapped: 24576 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:18.869426+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 16384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:19.869607+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 16384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:20.869736+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 16384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:21.869872+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 8192 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:22.869999+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 8192 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:23.870138+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 0 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:24.870278+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 0 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:25.870445+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:26.870573+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:27.870723+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:28.870874+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:29.871043+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:30.871179+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:31.871391+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59736064 unmapped: 1024000 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:32.871579+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59736064 unmapped: 1024000 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:33.871909+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59744256 unmapped: 1015808 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:34.872024+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59744256 unmapped: 1015808 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:35.872136+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:36.872278+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:37.872462+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:38.872591+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:39.872831+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:40.872970+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:41.873094+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:42.873221+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:43.873445+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:44.873639+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:45.873767+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:46.873986+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:47.874163+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:48.874379+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59793408 unmapped: 966656 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:49.874602+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59793408 unmapped: 966656 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:50.874784+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59793408 unmapped: 966656 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:51.874952+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59801600 unmapped: 958464 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:52.875103+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59801600 unmapped: 958464 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:53.875249+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59809792 unmapped: 950272 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:54.875364+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59809792 unmapped: 950272 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:55.875564+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59809792 unmapped: 950272 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:56.875735+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59817984 unmapped: 942080 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:57.875918+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59817984 unmapped: 942080 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:58.876069+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59817984 unmapped: 942080 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:59.876277+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59826176 unmapped: 933888 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:00.876537+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59826176 unmapped: 933888 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:01.876698+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59834368 unmapped: 925696 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:02.876871+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59834368 unmapped: 925696 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:03.877032+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59834368 unmapped: 925696 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:04.877213+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59842560 unmapped: 917504 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:05.877371+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59842560 unmapped: 917504 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:06.877549+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59842560 unmapped: 917504 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:07.877876+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59850752 unmapped: 909312 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:08.878064+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59850752 unmapped: 909312 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:09.878321+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59858944 unmapped: 901120 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:10.878474+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59858944 unmapped: 901120 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:11.878669+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59858944 unmapped: 901120 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:12.878860+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59867136 unmapped: 892928 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:13.879010+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59867136 unmapped: 892928 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:14.879252+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59875328 unmapped: 884736 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:15.879493+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59875328 unmapped: 884736 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:16.879654+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59875328 unmapped: 884736 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:17.879818+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59883520 unmapped: 876544 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:18.880206+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59883520 unmapped: 876544 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:19.881188+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59883520 unmapped: 876544 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:20.881762+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59891712 unmapped: 868352 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:21.882273+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59891712 unmapped: 868352 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:22.882497+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59899904 unmapped: 860160 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:23.882758+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59899904 unmapped: 860160 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:24.882921+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59899904 unmapped: 860160 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:25.883397+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59908096 unmapped: 851968 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:26.883570+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59908096 unmapped: 851968 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:27.883791+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59916288 unmapped: 843776 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:28.884105+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59916288 unmapped: 843776 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:29.884502+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59924480 unmapped: 835584 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:30.884744+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59924480 unmapped: 835584 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:31.885049+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59924480 unmapped: 835584 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:32.885342+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59932672 unmapped: 827392 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:33.885634+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59932672 unmapped: 827392 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:34.885940+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59932672 unmapped: 827392 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:35.886126+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59940864 unmapped: 819200 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:36.886337+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59940864 unmapped: 819200 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:37.886547+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59949056 unmapped: 811008 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:38.886927+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59949056 unmapped: 811008 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:39.887273+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59957248 unmapped: 802816 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:40.887478+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59957248 unmapped: 802816 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:41.887623+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59957248 unmapped: 802816 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:42.887814+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59965440 unmapped: 794624 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:43.887999+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59965440 unmapped: 794624 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:44.888178+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59965440 unmapped: 794624 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:45.888410+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59973632 unmapped: 786432 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:46.888551+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59973632 unmapped: 786432 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:47.888762+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59973632 unmapped: 786432 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:48.888966+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59981824 unmapped: 778240 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:49.889233+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59981824 unmapped: 778240 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:50.889357+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59990016 unmapped: 770048 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:51.889529+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59990016 unmapped: 770048 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:52.889661+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59990016 unmapped: 770048 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:53.889845+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59998208 unmapped: 761856 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:54.890276+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59998208 unmapped: 761856 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:55.890475+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60006400 unmapped: 753664 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:56.890623+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60006400 unmapped: 753664 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:57.890994+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60006400 unmapped: 753664 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:58.891169+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60014592 unmapped: 745472 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:59.891906+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60014592 unmapped: 745472 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:00.892101+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59998208 unmapped: 761856 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:01.892543+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60006400 unmapped: 753664 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:02.892745+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60006400 unmapped: 753664 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:03.892958+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60006400 unmapped: 753664 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:04.893389+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60014592 unmapped: 745472 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:05.893576+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60014592 unmapped: 745472 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:06.893819+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60022784 unmapped: 737280 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:07.893989+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60022784 unmapped: 737280 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:08.894141+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60022784 unmapped: 737280 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:09.894385+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60030976 unmapped: 729088 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:10.894606+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60030976 unmapped: 729088 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:11.894755+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60030976 unmapped: 729088 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:12.894949+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60039168 unmapped: 720896 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:13.895143+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60039168 unmapped: 720896 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:14.895327+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60047360 unmapped: 712704 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:15.895475+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60047360 unmapped: 712704 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:16.895615+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60047360 unmapped: 712704 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:17.895737+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60055552 unmapped: 704512 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:18.895927+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60055552 unmapped: 704512 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:19.896375+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60063744 unmapped: 696320 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:20.896559+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60063744 unmapped: 696320 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:21.896727+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60063744 unmapped: 696320 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:22.896896+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60071936 unmapped: 688128 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:23.897043+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60071936 unmapped: 688128 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:24.897237+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60080128 unmapped: 679936 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:25.897382+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60080128 unmapped: 679936 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:26.897543+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60080128 unmapped: 679936 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:27.897680+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60088320 unmapped: 671744 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:28.897798+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60088320 unmapped: 671744 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:29.897993+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60096512 unmapped: 663552 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:30.898158+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60096512 unmapped: 663552 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:31.898316+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60096512 unmapped: 663552 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:32.898449+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60104704 unmapped: 655360 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:33.898603+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60104704 unmapped: 655360 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:34.898769+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60104704 unmapped: 655360 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:35.898896+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60112896 unmapped: 647168 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:36.899045+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60112896 unmapped: 647168 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:37.899264+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60112896 unmapped: 647168 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:38.899492+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60121088 unmapped: 638976 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:39.899900+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60121088 unmapped: 638976 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:40.900043+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60129280 unmapped: 630784 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:41.900191+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60129280 unmapped: 630784 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:42.900335+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60129280 unmapped: 630784 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:43.900493+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60137472 unmapped: 622592 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:44.900640+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60137472 unmapped: 622592 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:45.900789+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60145664 unmapped: 614400 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:46.900990+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60145664 unmapped: 614400 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:47.901140+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60145664 unmapped: 614400 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:48.901285+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60153856 unmapped: 606208 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:49.901551+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60153856 unmapped: 606208 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:50.901755+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60162048 unmapped: 598016 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:51.901892+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60162048 unmapped: 598016 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:52.902050+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60162048 unmapped: 598016 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:53.902199+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60170240 unmapped: 589824 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:54.902413+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60170240 unmapped: 589824 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:55.902571+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60170240 unmapped: 589824 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:56.902675+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60178432 unmapped: 581632 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:58.028139+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60178432 unmapped: 581632 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:59.028953+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60186624 unmapped: 573440 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:00.029243+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60186624 unmapped: 573440 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:01.029394+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60194816 unmapped: 565248 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:02.029568+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60194816 unmapped: 565248 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:03.029728+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60194816 unmapped: 565248 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:04.029889+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60203008 unmapped: 557056 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:05.030062+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60203008 unmapped: 557056 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:06.030234+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60203008 unmapped: 557056 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:07.030433+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60211200 unmapped: 548864 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:08.030582+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60211200 unmapped: 548864 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:09.030751+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60219392 unmapped: 540672 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:10.030899+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60219392 unmapped: 540672 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:11.031021+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60219392 unmapped: 540672 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 podman[261541]: 2025-12-01 09:39:32.756955801 +0000 UTC m=+0.055629512 container create d47e3746fc9187e59d95a88b5eac2c80c66dce35c0de6ea7bb2f386d9da1aea3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_yalow, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:12.031150+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60219392 unmapped: 540672 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:13.031307+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60219392 unmapped: 540672 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:14.031446+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60219392 unmapped: 540672 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:15.031570+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60227584 unmapped: 532480 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:16.031747+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60227584 unmapped: 532480 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:17.031927+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60235776 unmapped: 524288 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:18.032118+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60235776 unmapped: 524288 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:19.032337+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60243968 unmapped: 516096 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:20.032517+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60243968 unmapped: 516096 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:21.032665+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60243968 unmapped: 516096 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:22.032802+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60252160 unmapped: 507904 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:23.032942+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60252160 unmapped: 507904 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:24.033171+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60252160 unmapped: 507904 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:25.033402+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60260352 unmapped: 499712 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:26.033666+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60260352 unmapped: 499712 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:27.034433+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60260352 unmapped: 499712 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:28.034576+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60268544 unmapped: 491520 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:29.034741+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60268544 unmapped: 491520 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:30.035128+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60276736 unmapped: 483328 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:31.035367+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60276736 unmapped: 483328 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:32.035515+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60284928 unmapped: 475136 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:33.035761+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60284928 unmapped: 475136 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:34.035929+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60284928 unmapped: 475136 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:35.036365+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60293120 unmapped: 466944 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:36.036491+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60293120 unmapped: 466944 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:37.036611+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60301312 unmapped: 458752 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:38.036728+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60301312 unmapped: 458752 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:39.036892+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60301312 unmapped: 458752 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:40.037102+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60309504 unmapped: 450560 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:41.037430+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60309504 unmapped: 450560 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:42.037678+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60309504 unmapped: 450560 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:43.038021+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60317696 unmapped: 442368 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:44.038163+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60317696 unmapped: 442368 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:45.038347+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Cumulative writes: 4343 writes, 19K keys, 4343 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
                                           Cumulative WAL: 4343 writes, 398 syncs, 10.91 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 4343 writes, 19K keys, 4343 commit groups, 1.0 writes per commit group, ingest: 16.04 MB, 0.03 MB/s
                                           Interval WAL: 4343 writes, 398 syncs, 10.91 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f19149090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f19149090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f19149090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60391424 unmapped: 368640 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:46.038531+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60399616 unmapped: 360448 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:47.038655+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60399616 unmapped: 360448 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:48.038800+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60407808 unmapped: 352256 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:49.038960+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60407808 unmapped: 352256 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:50.039181+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60407808 unmapped: 352256 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:51.039464+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:52.039677+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60416000 unmapped: 344064 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:53.039914+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60416000 unmapped: 344064 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:54.040089+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60424192 unmapped: 335872 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:55.040324+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60424192 unmapped: 335872 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:56.040470+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60424192 unmapped: 335872 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:57.040609+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60432384 unmapped: 327680 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:58.040780+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60440576 unmapped: 319488 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:59.040874+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60440576 unmapped: 319488 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:00.041195+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60448768 unmapped: 311296 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:01.041365+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60448768 unmapped: 311296 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:02.041530+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60456960 unmapped: 303104 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:03.041687+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60456960 unmapped: 303104 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:04.041841+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60465152 unmapped: 294912 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:05.041978+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60465152 unmapped: 294912 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:06.042111+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60465152 unmapped: 294912 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:07.042262+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60473344 unmapped: 286720 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:08.042418+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60473344 unmapped: 286720 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:09.042606+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60481536 unmapped: 278528 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:10.042803+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60481536 unmapped: 278528 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:11.042910+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60481536 unmapped: 278528 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:12.043050+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60489728 unmapped: 270336 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:13.043221+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60489728 unmapped: 270336 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:14.043352+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60497920 unmapped: 262144 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:15.043511+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60497920 unmapped: 262144 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:16.043643+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60497920 unmapped: 262144 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:17.044022+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60506112 unmapped: 253952 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:18.044233+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60506112 unmapped: 253952 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:19.045350+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60506112 unmapped: 253952 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:20.045535+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60506112 unmapped: 253952 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:21.045667+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60514304 unmapped: 245760 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:22.045801+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60514304 unmapped: 245760 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:23.045962+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60522496 unmapped: 237568 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:24.046104+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60522496 unmapped: 237568 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:25.046207+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60522496 unmapped: 237568 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:26.046563+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60530688 unmapped: 229376 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:27.046764+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60530688 unmapped: 229376 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:28.046904+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60538880 unmapped: 221184 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:29.047222+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60538880 unmapped: 221184 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:30.047458+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60538880 unmapped: 221184 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:31.047607+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60547072 unmapped: 212992 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:32.047823+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60547072 unmapped: 212992 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:33.047973+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60555264 unmapped: 204800 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:34.048167+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60555264 unmapped: 204800 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:35.048343+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60555264 unmapped: 204800 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:36.048494+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60563456 unmapped: 196608 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:37.048617+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60563456 unmapped: 196608 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:38.048764+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60571648 unmapped: 188416 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:39.048917+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60571648 unmapped: 188416 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:40.049090+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60579840 unmapped: 180224 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:41.049230+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60579840 unmapped: 180224 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:42.049442+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60579840 unmapped: 180224 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:43.049603+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60579840 unmapped: 180224 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:44.049737+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60588032 unmapped: 172032 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:45.049905+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60588032 unmapped: 172032 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:46.050175+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60588032 unmapped: 172032 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:47.050391+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60596224 unmapped: 163840 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:48.050572+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60596224 unmapped: 163840 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:49.050698+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60604416 unmapped: 155648 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:50.050900+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60604416 unmapped: 155648 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:51.051045+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60612608 unmapped: 147456 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:52.051220+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60612608 unmapped: 147456 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:53.051375+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60612608 unmapped: 147456 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:54.051595+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60620800 unmapped: 139264 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:55.051792+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60620800 unmapped: 139264 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:56.052160+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60620800 unmapped: 139264 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:57.052380+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60628992 unmapped: 131072 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:58.052562+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60628992 unmapped: 131072 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:59.052747+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60637184 unmapped: 122880 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:00.053199+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60637184 unmapped: 122880 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:01.053365+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60637184 unmapped: 122880 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:02.053544+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60645376 unmapped: 114688 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:03.053731+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60645376 unmapped: 114688 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:04.053890+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60653568 unmapped: 106496 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:05.054081+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60653568 unmapped: 106496 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:06.054224+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60653568 unmapped: 106496 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:07.054364+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60661760 unmapped: 98304 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:08.054530+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60661760 unmapped: 98304 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:09.054683+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60661760 unmapped: 98304 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:10.054919+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:11.055047+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:12.055193+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:13.055397+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:14.055538+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:15.055693+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:16.055833+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:17.056054+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:18.056184+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60694528 unmapped: 65536 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:19.056469+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60694528 unmapped: 65536 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:20.056664+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60702720 unmapped: 57344 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:21.056826+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60702720 unmapped: 57344 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:22.056960+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60702720 unmapped: 57344 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:23.057096+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60710912 unmapped: 49152 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:24.057241+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60710912 unmapped: 49152 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:25.057376+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60710912 unmapped: 49152 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:26.057558+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:27.057717+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:28.057957+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:29.058151+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:30.058344+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:31.058516+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:32.058686+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:33.058849+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:34.059014+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:35.059247+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:36.059465+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60751872 unmapped: 8192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:37.059681+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60751872 unmapped: 8192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:38.059875+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60751872 unmapped: 8192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:39.059995+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:40.060142+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:41.060276+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60768256 unmapped: 1040384 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:42.060427+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60768256 unmapped: 1040384 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:43.060555+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60768256 unmapped: 1040384 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:44.060726+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60776448 unmapped: 1032192 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:45.060903+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60776448 unmapped: 1032192 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:46.061055+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60776448 unmapped: 1032192 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:47.061188+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:48.061347+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:49.061527+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:50.061812+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:51.061980+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:52.062113+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:53.062252+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:54.062382+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:55.062612+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:56.062756+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:57.062892+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:58.063035+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:59.063197+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:00.063419+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:01.063597+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:02.063721+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:03.063886+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:04.064079+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:05.064214+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:06.064364+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:07.064489+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:08.064603+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:09.064727+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:10.065076+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:11.065243+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:12.065357+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:13.065481+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:14.065592+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:15.065721+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:16.065872+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:17.066026+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:18.066176+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:19.066323+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:20.066463+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:21.066590+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:22.066708+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:23.066838+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:24.066966+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:25.067104+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:26.067242+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:27.067437+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:28.067600+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:29.067765+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:30.068025+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:31.068204+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:32.068459+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:33.068624+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:34.068747+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:35.068925+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:36.069076+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:37.069501+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:38.069851+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:39.070004+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:40.070240+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:41.070434+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:42.070622+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:43.070766+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:44.070954+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:45.071134+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:46.071302+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:47.071445+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:48.071575+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:49.071739+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:50.072090+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:51.072316+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:52.072474+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:53.072619+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:54.072740+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:55.072857+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:56.072968+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:57.073128+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:58.073343+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:59.073473+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:00.073677+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:01.073809+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:02.074002+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:03.074259+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:04.074443+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:05.074615+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:06.074924+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:07.075066+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:08.075202+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:09.075352+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:10.075647+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:11.075792+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:12.075940+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:13.076117+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:14.076366+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:15.076540+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:16.076729+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:17.076915+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:18.077084+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:19.077317+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:20.077479+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:21.077644+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:22.077863+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:23.078059+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:24.078183+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:25.078375+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:26.078549+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:27.078676+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:28.078849+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:29.078999+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:30.079211+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:31.079357+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:32.079502+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:33.080000+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:34.080153+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:35.080368+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:36.080504+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:37.080692+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:38.080895+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:39.081076+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:40.081261+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:41.081430+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:42.081571+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:43.081844+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:44.082009+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:45.082255+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:46.082399+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:47.082571+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:48.082745+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:49.082975+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:50.083222+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:51.083439+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:52.083585+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:53.083776+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:54.084151+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:55.084500+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:56.084641+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:57.084765+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:58.084958+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:59.085114+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:00.085320+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:01.085471+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:02.085617+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:03.085797+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:04.085971+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:05.086141+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:06.086321+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:07.086498+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:08.086700+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:09.086897+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:10.087065+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:11.087232+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:12.087357+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:13.087560+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:14.087750+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:15.087902+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:16.088202+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:17.088418+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:18.088584+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:19.088725+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:20.088918+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:21.089056+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:22.089206+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:23.089402+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:24.089547+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:25.089675+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:26.089831+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:27.089984+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:28.090115+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:29.090275+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:30.090530+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:31.090674+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:32.090841+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:33.090999+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:34.091198+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:35.091491+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:36.091787+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:37.092044+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:38.092271+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:39.092465+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:40.092679+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:41.092815+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:42.092954+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:43.093138+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:44.093339+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:45.093465+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:46.093613+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:47.093753+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:48.093881+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:49.094012+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:50.094721+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:51.095252+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:52.095515+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:53.095652+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:54.097469+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:55.097837+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:56.098087+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:57.098261+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:58.098542+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 ms_handle_reset con 0x555f1b271c00 session 0x555f19f23860
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b3fe400
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 ms_handle_reset con 0x555f1b3fe800 session 0x555f1b28fc20
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b271c00
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:59.098758+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:00.098938+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:01.099110+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:02.099285+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:03.099464+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:04.099622+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:05.099793+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:06.100014+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:07.100226+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:08.100422+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:09.100613+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:10.100891+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:11.101085+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:12.101233+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:13.101397+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:14.101579+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:15.101713+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:16.101850+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:17.102011+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:18.102135+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:19.102323+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:20.102556+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:21.102719+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:22.102894+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:23.103204+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:24.103347+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:25.104077+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:26.104222+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:27.104362+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:28.104487+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:29.104639+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:30.104789+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:31.104928+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:32.105062+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:33.105198+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:34.105341+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:35.105463+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:36.105639+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:37.105815+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:38.106058+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:39.106359+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:40.106526+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:41.106660+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:42.106846+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:43.106979+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:44.107213+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:45.107348+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:46.107551+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:47.107764+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:48.107953+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:49.108090+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:50.108249+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:51.108409+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:52.108556+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:53.108703+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:54.109024+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:55.109217+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:56.109373+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:57.109532+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:58.109730+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:59.109871+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:00.110410+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:01.110565+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:02.110761+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:03.110910+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:04.111236+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:05.111374+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:06.111708+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:07.111915+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:08.112208+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:09.112424+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:10.114394+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:11.114596+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:12.114817+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:13.115006+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:14.115144+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:15.115310+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:16.115509+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:17.115674+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:18.115892+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:19.116084+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:20.116354+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:21.116604+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:22.116748+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:23.116853+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:24.117001+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:25.117151+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:26.117507+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:27.117856+0000)
Dec 01 09:39:32 compute-0 systemd[1]: Started libpod-conmon-d47e3746fc9187e59d95a88b5eac2c80c66dce35c0de6ea7bb2f386d9da1aea3.scope.
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:28.118064+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:29.118232+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:30.118461+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:31.118595+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:32.118781+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:33.118917+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:34.119092+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:35.119369+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:36.119528+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:37.119648+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:38.119802+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:39.119924+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:40.120227+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:41.120370+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:42.120560+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:43.120712+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:44.121082+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:45.121228+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:46.121406+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:47.121550+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:48.121671+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:49.121834+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:50.122068+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:51.122268+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:52.122472+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:53.122600+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:54.122768+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:55.122935+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:56.123312+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:57.123441+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:58.123578+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:59.123715+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:00.123919+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:01.124322+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:02.124486+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:03.124736+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:04.124928+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:05.125078+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:06.125602+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:07.126045+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:08.126191+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:09.126627+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:10.127006+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:11.127668+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:12.128035+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:13.128420+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:14.128570+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:15.128720+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:16.128860+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:17.129108+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:18.129273+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:19.129451+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:20.129627+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:21.129809+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:22.129939+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:23.130100+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:24.130264+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:25.130404+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:26.130538+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:27.130705+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:28.130844+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:29.131051+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:30.131263+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:31.131774+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:32.131892+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:33.132060+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:34.133674+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:35.133798+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:36.134355+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:37.134495+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:38.134636+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:39.134775+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:40.134931+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:41.135427+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:42.135577+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:43.135756+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:44.135874+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:45.135986+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:46.136130+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:47.136311+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:48.136474+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:49.136630+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:50.136853+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:51.136980+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:52.137148+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:53.137303+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:54.137444+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:55.137657+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:56.137836+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:57.138082+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:58.138265+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:59.138399+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:00.138591+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:01.138707+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:02.138857+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:03.139363+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:04.139534+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:05.139704+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:06.139894+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:07.140316+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:08.140562+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 podman[261541]: 2025-12-01 09:39:32.732488067 +0000 UTC m=+0.031161808 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:09.140878+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:10.141098+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:11.141280+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:12.141435+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:13.141668+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:14.141897+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:15.142034+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:16.142268+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:17.142429+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:18.142667+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:19.142924+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:20.143154+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:21.143359+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:22.143536+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:23.143765+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:24.143936+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:25.144283+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:26.144566+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:27.144969+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:28.145239+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:29.145536+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:30.145873+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:31.146070+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:32.146238+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:33.146430+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:34.146602+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:35.146754+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:36.146889+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:37.147059+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:38.147239+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:39.147391+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:40.147605+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:41.147742+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:42.147895+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:43.148059+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:44.148263+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:45.148486+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:46.148635+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:47.148803+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:48.149010+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:49.149221+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:50.149474+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:51.149634+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:52.149776+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:53.149940+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:54.150139+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:55.150362+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:56.150553+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:57.150665+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:58.150783+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:59.150914+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:00.151092+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:01.151249+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:02.151548+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:03.151658+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:04.151926+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:05.152153+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:06.152350+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:07.152527+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:08.152698+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:09.152834+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:10.153089+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:11.153346+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:12.153543+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:13.153740+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:14.153873+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:15.154038+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:16.154251+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:17.154419+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:18.154562+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:19.154725+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:20.154911+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:21.155049+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:22.155190+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:23.155331+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:24.155534+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:25.155696+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:26.155876+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:27.156073+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:28.156235+0000)
Dec 01 09:39:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dda5c0e15ffdc170d50a99b16dd0975927008be6e08265cc75de0eca91d44f3e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:39:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dda5c0e15ffdc170d50a99b16dd0975927008be6e08265cc75de0eca91d44f3e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:39:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dda5c0e15ffdc170d50a99b16dd0975927008be6e08265cc75de0eca91d44f3e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:39:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dda5c0e15ffdc170d50a99b16dd0975927008be6e08265cc75de0eca91d44f3e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:29.156380+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0) v1
Dec 01 09:39:32 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/629889039' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Dec 01 09:39:32 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:30.156562+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:31.156845+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:32.157077+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:33.157249+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:34.157488+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:35.157652+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:36.157805+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:37.157935+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:38.158050+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:39.158235+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:40.158571+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:41.158711+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:42.158900+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:43.159032+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:44.159199+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:45.159399+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Cumulative writes: 4343 writes, 19K keys, 4343 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 4343 writes, 398 syncs, 10.91 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 0.0001 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 0.0001 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 0.0001 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 0.0001 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 0.0001 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 0.0001 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 0.0001 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f19149090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f19149090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f19149090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 0.0001 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 0.0001 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:46.159574+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:47.159733+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:48.159894+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:49.160036+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:50.160236+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:51.160376+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:52.160531+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:53.160762+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:54.160901+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:55.161023+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:56.161178+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:57.161423+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:58.161619+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:59.161834+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:00.163065+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:01.163208+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:02.163373+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:03.163530+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:04.163693+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:05.163846+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:06.163977+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:07.164145+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:08.164425+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:09.164641+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:10.164871+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:11.165047+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:12.165211+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:13.165429+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:14.165618+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:15.165800+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:16.165984+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:17.166157+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:18.166375+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:19.166530+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:20.167011+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:21.167221+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:22.167360+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:23.167494+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:24.167629+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:25.167777+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:26.167912+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:27.168065+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:28.168278+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:29.168506+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:30.168855+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:31.169012+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:32.169218+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:33.169399+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:34.169535+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:35.169669+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:36.169795+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:37.169923+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:38.170028+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:39.170151+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:40.170282+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:41.170452+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:42.170648+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:43.170817+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:44.170995+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:45.171170+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:46.171392+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:47.171614+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:48.171897+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:49.172044+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:50.172246+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:51.172432+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:52.172645+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:53.172874+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:54.173023+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:55.173224+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:56.173386+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:57.173547+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:58.173741+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:59.173867+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:00.174049+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:01.174243+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:02.174385+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:03.174524+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:04.174667+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:05.174866+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:06.175102+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:07.175268+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:08.175474+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:09.175733+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:10.175926+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:11.176089+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:12.176251+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:13.176426+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:14.176586+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:15.176753+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:16.176938+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:17.177232+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:18.177443+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:19.177618+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:20.177817+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:21.177962+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:22.178157+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:23.178378+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:24.178592+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:25.178775+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:26.178963+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:27.179168+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:28.179382+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:29.179571+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:30.179744+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:31.179981+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:32.180191+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:33.180374+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:34.180542+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:35.180755+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:36.180903+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:37.181221+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:38.181591+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:39.181737+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:40.181967+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:41.182141+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:42.182275+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:43.182443+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60817408 unmapped: 991232 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:44.182637+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60817408 unmapped: 991232 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:45.182815+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60817408 unmapped: 991232 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:46.182963+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60817408 unmapped: 991232 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:47.183161+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60817408 unmapped: 991232 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:48.183350+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60817408 unmapped: 991232 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:49.183583+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60817408 unmapped: 991232 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858c00
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 handle_osd_map epochs [47,48], i have 47, src has [1,48]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 1081.428955078s of 1081.454589844s, submitted: 8
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 47 handle_osd_map epochs [48,48], i have 48, src has [1,48]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 48 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:50.183842+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 48 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60850176 unmapped: 958464 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 48 handle_osd_map epochs [48,49], i have 48, src has [1,49]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:51.184036+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 552245 data_alloc: 218103808 data_used: 24576
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61202432 unmapped: 17391616 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 49 heartbeat osd_stat(store_statfs(0x4fe0e2000/0x0/0x4ffc00000, data 0x9f226/0xeb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 49 handle_osd_map epochs [49,50], i have 49, src has [1,50]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 50 handle_osd_map epochs [50,50], i have 50, src has [1,50]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 50 ms_handle_reset con 0x555f1b858c00 session 0x555f1abae780
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:52.184225+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61243392 unmapped: 17350656 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:53.184413+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61251584 unmapped: 17342464 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 50 handle_osd_map epochs [51,51], i have 50, src has [1,51]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 51 ms_handle_reset con 0x555f1b858000 session 0x555f1ab3f680
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:54.184585+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61251584 unmapped: 17342464 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:55.184982+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61251584 unmapped: 17342464 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 51 heartbeat osd_stat(store_statfs(0x4fc8d7000/0x0/0x4ffc00000, data 0x18a33eb/0x18f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:56.185186+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 614157 data_alloc: 218103808 data_used: 24576
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61251584 unmapped: 17342464 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:57.185359+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61251584 unmapped: 17342464 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:58.185488+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61251584 unmapped: 17342464 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:59.185662+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61251584 unmapped: 17342464 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:00.185910+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 51 heartbeat osd_stat(store_statfs(0x4fc8d7000/0x0/0x4ffc00000, data 0x18a33eb/0x18f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 51 handle_osd_map epochs [52,52], i have 51, src has [1,52]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.322311401s of 10.521731377s, submitted: 31
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:01.186075+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 615609 data_alloc: 218103808 data_used: 24576
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:02.186366+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:03.186564+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 52 heartbeat osd_stat(store_statfs(0x4fc8d6000/0x0/0x4ffc00000, data 0x18a488b/0x18f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:04.186736+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 52 heartbeat osd_stat(store_statfs(0x4fc8d6000/0x0/0x4ffc00000, data 0x18a488b/0x18f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:05.186933+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:06.187096+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 615609 data_alloc: 218103808 data_used: 24576
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:07.187274+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:08.187519+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 52 heartbeat osd_stat(store_statfs(0x4fc8d6000/0x0/0x4ffc00000, data 0x18a488b/0x18f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:09.187763+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:10.188003+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:11.188173+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 615609 data_alloc: 218103808 data_used: 24576
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:12.188351+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:13.188547+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 52 heartbeat osd_stat(store_statfs(0x4fc8d6000/0x0/0x4ffc00000, data 0x18a488b/0x18f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:14.188752+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 52 heartbeat osd_stat(store_statfs(0x4fc8d6000/0x0/0x4ffc00000, data 0x18a488b/0x18f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:15.188970+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 podman[261541]: 2025-12-01 09:39:32.864487355 +0000 UTC m=+0.163161086 container init d47e3746fc9187e59d95a88b5eac2c80c66dce35c0de6ea7bb2f386d9da1aea3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_yalow, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:16.189150+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 615609 data_alloc: 218103808 data_used: 24576
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 52 heartbeat osd_stat(store_statfs(0x4fc8d6000/0x0/0x4ffc00000, data 0x18a488b/0x18f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:17.189430+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:18.189623+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:19.189788+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:20.190451+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 52 heartbeat osd_stat(store_statfs(0x4fc8d6000/0x0/0x4ffc00000, data 0x18a488b/0x18f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:21.191033+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 615609 data_alloc: 218103808 data_used: 24576
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:22.192426+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 52 heartbeat osd_stat(store_statfs(0x4fc8d6000/0x0/0x4ffc00000, data 0x18a488b/0x18f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:23.193262+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:24.193686+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:25.193847+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b8eb400
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 25.412767410s of 25.425762177s, submitted: 13
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:26.194013+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 52 handle_osd_map epochs [53,53], i have 52, src has [1,53]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 620555 data_alloc: 218103808 data_used: 28672
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 53 ms_handle_reset con 0x555f1b8eb400 session 0x555f1b28e5a0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61243392 unmapped: 17350656 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 53 heartbeat osd_stat(store_statfs(0x4fc8d2000/0x0/0x4ffc00000, data 0x18a5e55/0x18fb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:27.194362+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b8eb000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 53 ms_handle_reset con 0x555f1b8eb000 session 0x555f1aa64960
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb09000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 53 ms_handle_reset con 0x555f1cb09000 session 0x555f1aa645a0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61259776 unmapped: 17334272 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:28.194789+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 53 heartbeat osd_stat(store_statfs(0x4fc8d2000/0x0/0x4ffc00000, data 0x18a5e55/0x18fb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61259776 unmapped: 17334272 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb09000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:29.195081+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 53 handle_osd_map epochs [54,54], i have 53, src has [1,54]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 54 ms_handle_reset con 0x555f1cb09000 session 0x555f1aa33a40
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 62332928 unmapped: 16261120 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 54 ms_handle_reset con 0x555f1b858000 session 0x555f1aa32d20
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08800
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 54 ms_handle_reset con 0x555f1cb08800 session 0x555f1aa32000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:30.195390+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08c00
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 62332928 unmapped: 16261120 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:31.196266+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 624199 data_alloc: 218103808 data_used: 28672
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 54 handle_osd_map epochs [55,55], i have 54, src has [1,55]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 55 ms_handle_reset con 0x555f1cb08c00 session 0x555f1aa32780
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61358080 unmapped: 17235968 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:32.196508+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858c00
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61390848 unmapped: 17203200 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:33.196700+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 55 heartbeat osd_stat(store_statfs(0x4fc8cc000/0x0/0x4ffc00000, data 0x18a8a1d/0x1901000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 55 handle_osd_map epochs [55,56], i have 55, src has [1,56]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 56 ms_handle_reset con 0x555f1b858c00 session 0x555f1aa32f00
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 62455808 unmapped: 16138240 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:34.196971+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 62455808 unmapped: 16138240 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb09000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:35.197209+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 56 handle_osd_map epochs [57,57], i have 56, src has [1,57]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 57 ms_handle_reset con 0x555f1cb09000 session 0x555f1aa64780
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 62504960 unmapped: 16089088 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:36.197456+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 634040 data_alloc: 218103808 data_used: 28672
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 62521344 unmapped: 16072704 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08c00
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.331473351s of 11.440871239s, submitted: 32
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:37.197850+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08800
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 57 handle_osd_map epochs [58,58], i have 57, src has [1,58]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 58 ms_handle_reset con 0x555f1cb08800 session 0x555f1ab53a40
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 58 ms_handle_reset con 0x555f1cb08c00 session 0x555f1aa652c0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 15613952 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858c00
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:38.198035+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 58 handle_osd_map epochs [59,59], i have 58, src has [1,59]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 59 ms_handle_reset con 0x555f1b858c00 session 0x555f1ab3e3c0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 63291392 unmapped: 23699456 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b8eb000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:39.198255+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 59 heartbeat osd_stat(store_statfs(0x4fa8b9000/0x0/0x4ffc00000, data 0x38adc5c/0x3914000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 59 handle_osd_map epochs [60,60], i have 59, src has [1,60]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 60 ms_handle_reset con 0x555f1b8eb000 session 0x555f1a5cde00
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 60 ms_handle_reset con 0x555f1b858000 session 0x555f1c7ac780
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 63479808 unmapped: 23511040 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858c00
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08800
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:40.198693+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 60 handle_osd_map epochs [61,61], i have 60, src has [1,61]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 61 ms_handle_reset con 0x555f1b858c00 session 0x555f1ab6e000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08c00
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 64602112 unmapped: 22388736 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:41.198864+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 61 handle_osd_map epochs [61,62], i have 61, src has [1,62]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 62 ms_handle_reset con 0x555f1cb08c00 session 0x555f1aa610e0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1009042 data_alloc: 218103808 data_used: 45056
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 62 ms_handle_reset con 0x555f1cb08800 session 0x555f1aa5f860
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb09000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 65748992 unmapped: 21241856 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b8eb400
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:42.199046+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 62 handle_osd_map epochs [62,63], i have 62, src has [1,63]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb23000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 63 handle_osd_map epochs [63,63], i have 63, src has [1,63]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 63 ms_handle_reset con 0x555f1cb23000 session 0x555f1ab532c0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858c00
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 63 ms_handle_reset con 0x555f1cb09000 session 0x555f1b29cd20
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 63 ms_handle_reset con 0x555f1b8eb400 session 0x555f1c7ac5a0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 66019328 unmapped: 20971520 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:43.200606+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 63 handle_osd_map epochs [64,64], i have 63, src has [1,64]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 66109440 unmapped: 20881408 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:44.200758+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 64 handle_osd_map epochs [65,65], i have 64, src has [1,65]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 65 ms_handle_reset con 0x555f1b858c00 session 0x555f1aa5f2c0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 65 ms_handle_reset con 0x555f1b858000 session 0x555f1ab53680
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08800
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 65 ms_handle_reset con 0x555f1cb08800 session 0x555f1ab3ef00
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 66191360 unmapped: 20799488 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08800
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 65 heartbeat osd_stat(store_statfs(0x4fc89a000/0x0/0x4ffc00000, data 0x18b8e3b/0x1930000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:45.200905+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 65 handle_osd_map epochs [66,66], i have 65, src has [1,66]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 66 ms_handle_reset con 0x555f1cb08800 session 0x555f1ab3e780
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 67280896 unmapped: 19709952 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:46.201073+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 66 handle_osd_map epochs [67,67], i have 66, src has [1,67]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 705756 data_alloc: 218103808 data_used: 65536
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 67 ms_handle_reset con 0x555f1b858000 session 0x555f1a5cd0e0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 67362816 unmapped: 19628032 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858c00
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:47.201248+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 67 handle_osd_map epochs [68,68], i have 67, src has [1,68]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.978338242s of 10.211093903s, submitted: 268
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b8eb400
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 67551232 unmapped: 19439616 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 68 ms_handle_reset con 0x555f1b858c00 session 0x555f1aa610e0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb09000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:48.201445+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 68 handle_osd_map epochs [68,69], i have 68, src has [1,69]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 69 ms_handle_reset con 0x555f1b8eb400 session 0x555f1c783c20
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 69 ms_handle_reset con 0x555f1cb09000 session 0x555f1aa32b40
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 67641344 unmapped: 19349504 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb09000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:49.201611+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 69 handle_osd_map epochs [69,70], i have 69, src has [1,70]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 70 ms_handle_reset con 0x555f1cb09000 session 0x555f1ab6fc20
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 70 ms_handle_reset con 0x555f1b858000 session 0x555f1c783a40
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 67756032 unmapped: 19234816 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858c00
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:50.202110+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 70 handle_osd_map epochs [71,71], i have 70, src has [1,71]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 71 ms_handle_reset con 0x555f1b858c00 session 0x555f1aa64f00
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 67805184 unmapped: 19185664 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 71 heartbeat osd_stat(store_statfs(0x4fb2e2000/0x0/0x4ffc00000, data 0x18bf0bd/0x1936000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:51.202264+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 712002 data_alloc: 218103808 data_used: 73728
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 67846144 unmapped: 19144704 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:52.202393+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b8eb400
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 71 ms_handle_reset con 0x555f1b8eb400 session 0x555f1ab3f860
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 67846144 unmapped: 19144704 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:53.202549+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08800
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 71 ms_handle_reset con 0x555f1cb08800 session 0x555f1ab3e780
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 67829760 unmapped: 19161088 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08800
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 71 ms_handle_reset con 0x555f1cb08800 session 0x555f1ab3f4a0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:54.202714+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 67829760 unmapped: 19161088 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:55.202870+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 71 handle_osd_map epochs [73,73], i have 71, src has [1,73]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 71 handle_osd_map epochs [72,73], i have 71, src has [1,73]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 73 ms_handle_reset con 0x555f1b858000 session 0x555f1ab3fc20
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 73 heartbeat osd_stat(store_statfs(0x4fb2bd000/0x0/0x4ffc00000, data 0x18e5d82/0x1960000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69132288 unmapped: 17858560 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858c00
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b8eb400
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:56.203040+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 722395 data_alloc: 218103808 data_used: 81920
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69156864 unmapped: 17833984 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:57.203187+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 73 heartbeat osd_stat(store_statfs(0x4fb2bd000/0x0/0x4ffc00000, data 0x18e5d82/0x1960000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69156864 unmapped: 17833984 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:58.203362+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69156864 unmapped: 17833984 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb09000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:59.203470+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 73 handle_osd_map epochs [74,74], i have 73, src has [1,74]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.536053658s of 12.151283264s, submitted: 168
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 74 ms_handle_reset con 0x555f1cb09000 session 0x555f1aa614a0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69206016 unmapped: 17784832 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08c00
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:00.203660+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 74 handle_osd_map epochs [75,75], i have 74, src has [1,75]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 75 ms_handle_reset con 0x555f1cb08c00 session 0x555f1aa603c0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69255168 unmapped: 17735680 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:01.203888+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb22c00
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 731868 data_alloc: 218103808 data_used: 102400
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 75 ms_handle_reset con 0x555f1cb22c00 session 0x555f1aa330e0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69443584 unmapped: 17547264 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:02.204415+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 75 ms_handle_reset con 0x555f1b858000 session 0x555f1ab53e00
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08800
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 75 ms_handle_reset con 0x555f1cb08800 session 0x555f1b28e960
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69443584 unmapped: 17547264 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:03.204779+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 75 heartbeat osd_stat(store_statfs(0x4fb2b7000/0x0/0x4ffc00000, data 0x18e8982/0x1966000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 75 handle_osd_map epochs [76,76], i have 75, src has [1,76]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08c00
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 76 handle_osd_map epochs [76,77], i have 76, src has [1,77]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 77 handle_osd_map epochs [77,77], i have 77, src has [1,77]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69484544 unmapped: 17506304 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 77 ms_handle_reset con 0x555f1cb08c00 session 0x555f1c782960
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:04.205001+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb09000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 77 handle_osd_map epochs [78,78], i have 77, src has [1,78]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 78 ms_handle_reset con 0x555f1cb09000 session 0x555f1ab661e0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69492736 unmapped: 17498112 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:05.205155+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 78 handle_osd_map epochs [78,79], i have 78, src has [1,79]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 79 heartbeat osd_stat(store_statfs(0x4fb2af000/0x0/0x4ffc00000, data 0x18eb9f2/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69541888 unmapped: 17448960 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 podman[261541]: 2025-12-01 09:39:32.872120895 +0000 UTC m=+0.170794606 container start d47e3746fc9187e59d95a88b5eac2c80c66dce35c0de6ea7bb2f386d9da1aea3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_yalow, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Dec 01 09:39:32 compute-0 podman[261541]: 2025-12-01 09:39:32.878317023 +0000 UTC m=+0.176990754 container attach d47e3746fc9187e59d95a88b5eac2c80c66dce35c0de6ea7bb2f386d9da1aea3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_yalow, OSD_FLAVOR=default, org.label-schema.build-date=20250507, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=reef, io.buildah.version=1.39.3)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:06.205373+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 748246 data_alloc: 218103808 data_used: 102400
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69541888 unmapped: 17448960 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:07.205605+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69541888 unmapped: 17448960 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb22800
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 79 ms_handle_reset con 0x555f1cb22800 session 0x555f1ab67860
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 79 ms_handle_reset con 0x555f1b858000 session 0x555f1ab66d20
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08800
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 79 ms_handle_reset con 0x555f1cb08800 session 0x555f1ab66000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:08.205780+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08c00
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 79 ms_handle_reset con 0x555f1cb08c00 session 0x555f1ab66b40
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb09000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb22400
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 79 ms_handle_reset con 0x555f1cb22400 session 0x555f1aa32b40
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 79 ms_handle_reset con 0x555f1cb09000 session 0x555f1ab67c20
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 79 ms_handle_reset con 0x555f1b858000 session 0x555f1c782000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08800
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 79 ms_handle_reset con 0x555f1cb08800 session 0x555f1ab66b40
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb09c00
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 79 ms_handle_reset con 0x555f1cb09c00 session 0x555f1ab67860
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb09800
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69591040 unmapped: 17399808 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 79 ms_handle_reset con 0x555f1cb09800 session 0x555f1aa61c20
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:09.205980+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 79 heartbeat osd_stat(store_statfs(0x4fb2aa000/0x0/0x4ffc00000, data 0x18ee51a/0x1974000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69591040 unmapped: 17399808 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:10.206530+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb09400
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 79 handle_osd_map epochs [80,80], i have 79, src has [1,80]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.622175217s of 10.975051880s, submitted: 122
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 80 ms_handle_reset con 0x555f1cb09400 session 0x555f1ab6e3c0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69910528 unmapped: 17080320 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:11.206702+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08800
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 757501 data_alloc: 218103808 data_used: 114688
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69918720 unmapped: 17072128 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 80 heartbeat osd_stat(store_statfs(0x4fb281000/0x0/0x4ffc00000, data 0x1913a15/0x199c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:12.206891+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69918720 unmapped: 17072128 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 80 heartbeat osd_stat(store_statfs(0x4fb281000/0x0/0x4ffc00000, data 0x1913a15/0x199c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:13.207062+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69918720 unmapped: 17072128 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:14.207268+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69918720 unmapped: 17072128 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:15.207435+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb09800
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 80 ms_handle_reset con 0x555f1cb09800 session 0x555f1abae3c0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb09c00
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69836800 unmapped: 17154048 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:16.207585+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 80 handle_osd_map epochs [81,81], i have 80, src has [1,81]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 81 ms_handle_reset con 0x555f1cb09c00 session 0x555f1ab3e960
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08c00
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 81 ms_handle_reset con 0x555f1cb08000 session 0x555f1aa652c0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 81 ms_handle_reset con 0x555f1cb08c00 session 0x555f1b4b92c0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb23400
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 81 ms_handle_reset con 0x555f1cb23400 session 0x555f1aa60780
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 766180 data_alloc: 218103808 data_used: 135168
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69959680 unmapped: 17031168 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:17.207719+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 81 handle_osd_map epochs [82,82], i have 81, src has [1,82]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 82 ms_handle_reset con 0x555f1cb08000 session 0x555f1ab6fc20
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08c00
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 70017024 unmapped: 16973824 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:18.207885+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 82 handle_osd_map epochs [83,83], i have 82, src has [1,83]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 83 ms_handle_reset con 0x555f1cb08c00 session 0x555f1ab66d20
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 71081984 unmapped: 15908864 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 83 heartbeat osd_stat(store_statfs(0x4fb276000/0x0/0x4ffc00000, data 0x1917fc3/0x19a7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:19.208046+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb09800
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 83 ms_handle_reset con 0x555f1cb09800 session 0x555f1c782000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb09c00
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 83 ms_handle_reset con 0x555f1cb09c00 session 0x555f1aa60780
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb23c00
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 83 ms_handle_reset con 0x555f1cb23c00 session 0x555f1aa652c0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 15892480 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:20.208236+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 15892480 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 83 ms_handle_reset con 0x555f1cb08000 session 0x555f1ab67860
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:21.208441+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08c00
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.769915581s of 10.991518974s, submitted: 90
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 771276 data_alloc: 218103808 data_used: 139264
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 71106560 unmapped: 15884288 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:22.208653+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 83 handle_osd_map epochs [84,84], i have 83, src has [1,84]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 84 ms_handle_reset con 0x555f1cb08c00 session 0x555f1b91a780
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 15859712 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:23.208849+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 84 ms_handle_reset con 0x555f1b858000 session 0x555f1b28e960
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 84 ms_handle_reset con 0x555f1cb08800 session 0x555f1ab6f860
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 84 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x19191dd/0x19a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb09800
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 71163904 unmapped: 15826944 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:24.209058+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72237056 unmapped: 14753792 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:25.209213+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 84 handle_osd_map epochs [85,85], i have 84, src has [1,85]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72278016 unmapped: 14712832 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:26.209349+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 85 handle_osd_map epochs [86,86], i have 85, src has [1,86]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 86 ms_handle_reset con 0x555f1cb09800 session 0x555f1b29cd20
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 776985 data_alloc: 218103808 data_used: 139264
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72302592 unmapped: 14688256 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:27.209535+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72318976 unmapped: 14671872 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:28.209777+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72318976 unmapped: 14671872 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:29.211060+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 86 ms_handle_reset con 0x555f1b858c00 session 0x555f1ab53a40
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 86 ms_handle_reset con 0x555f1b8eb400 session 0x555f1abaf860
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 86 heartbeat osd_stat(store_statfs(0x4fb293000/0x0/0x4ffc00000, data 0x18f7c42/0x1988000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 86 ms_handle_reset con 0x555f1b858000 session 0x555f1b29d680
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72351744 unmapped: 14639104 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:30.211226+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72351744 unmapped: 14639104 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:31.211415+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08000
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 86 handle_osd_map epochs [87,87], i have 86, src has [1,87]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.348530769s of 10.076562881s, submitted: 127
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 87 ms_handle_reset con 0x555f1cb08000 session 0x555f1b4b9860
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08800
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 775824 data_alloc: 218103808 data_used: 143360
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72368128 unmapped: 14622720 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:32.211534+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 87 handle_osd_map epochs [88,88], i have 87, src has [1,88]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 88 ms_handle_reset con 0x555f1cb08800 session 0x555f1c782960
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72376320 unmapped: 14614528 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:33.211648+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72376320 unmapped: 14614528 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:34.211785+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 88 heartbeat osd_stat(store_statfs(0x4fb2b5000/0x0/0x4ffc00000, data 0x18d66d9/0x1967000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72376320 unmapped: 14614528 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:35.211934+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72376320 unmapped: 14614528 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:36.212092+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 777730 data_alloc: 218103808 data_used: 131072
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72376320 unmapped: 14614528 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:37.212824+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72376320 unmapped: 14614528 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:38.212990+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 88 heartbeat osd_stat(store_statfs(0x4fb2b5000/0x0/0x4ffc00000, data 0x18d66d9/0x1967000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 88 handle_osd_map epochs [89,89], i have 88, src has [1,89]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 88 handle_osd_map epochs [89,89], i have 89, src has [1,89]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72376320 unmapped: 14614528 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:39.213163+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72376320 unmapped: 14614528 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 89 heartbeat osd_stat(store_statfs(0x4fb2b3000/0x0/0x4ffc00000, data 0x18d7b95/0x196a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:40.213347+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72376320 unmapped: 14614528 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:41.213495+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 779854 data_alloc: 218103808 data_used: 131072
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72376320 unmapped: 14614528 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:42.213644+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 89 handle_osd_map epochs [90,90], i have 89, src has [1,90]
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.189030647s of 11.284521103s, submitted: 85
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b3000/0x0/0x4ffc00000, data 0x18d7b95/0x196a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:43.213775+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:44.213932+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:45.214089+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:46.214348+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:47.214593+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:48.214738+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:49.214911+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:50.215160+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:51.215432+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:52.215585+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:53.215723+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:54.215866+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:55.216154+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:56.216280+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:57.216403+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:58.216532+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:59.216670+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:00.216864+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:01.216973+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:02.217108+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:03.217276+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:04.217440+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:05.217585+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:06.217745+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:07.217902+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:08.218584+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:09.218741+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:10.218992+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:11.219183+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:12.219368+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:13.219546+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:14.219740+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:15.219921+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:16.220429+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:17.220707+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:18.221010+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:19.221173+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:20.221468+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:21.221654+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:22.221835+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:23.222077+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:24.222339+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:25.222542+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:26.222808+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:27.222975+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:28.223137+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:29.223327+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:30.224003+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:31.224189+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:32.228920+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:33.233185+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:34.236264+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:35.236970+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:36.238224+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:37.238864+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:38.241145+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:39.243172+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:40.243800+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:41.245062+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:42.245386+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:43.245682+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:44.245980+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:45.246186+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:46.246419+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:47.246601+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:48.246770+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:49.246932+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:50.247260+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:51.247441+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:52.247612+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:53.247825+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:54.247997+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:55.248172+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:56.248324+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:57.248476+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:58.248642+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:59.248790+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72613888 unmapped: 14376960 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: do_command 'config diff' '{prefix=config diff}'
Dec 01 09:39:32 compute-0 ceph-osd[89052]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec 01 09:39:32 compute-0 ceph-osd[89052]: do_command 'config show' '{prefix=config show}'
Dec 01 09:39:32 compute-0 ceph-osd[89052]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec 01 09:39:32 compute-0 ceph-osd[89052]: do_command 'counter dump' '{prefix=counter dump}'
Dec 01 09:39:32 compute-0 ceph-osd[89052]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:00.249023+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: do_command 'counter schema' '{prefix=counter schema}'
Dec 01 09:39:32 compute-0 ceph-osd[89052]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 14114816 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:01.249180+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72843264 unmapped: 14147584 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:32 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:32 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:39:32 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:02.249324+0000)
Dec 01 09:39:32 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 13893632 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:32 compute-0 ceph-osd[89052]: do_command 'log dump' '{prefix=log dump}'
Dec 01 09:39:33 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0) v1
Dec 01 09:39:33 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4187076037' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Dec 01 09:39:33 compute-0 rsyslogd[1007]: imjournal from <np0005540741:ceph-osd>: begin to drop messages due to rate-limiting
Dec 01 09:39:33 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata"} v 0) v1
Dec 01 09:39:33 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2863532065' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec 01 09:39:33 compute-0 ceph-mon[75031]: pgmap v870: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:33 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1694800268' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Dec 01 09:39:33 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/629889039' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Dec 01 09:39:33 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/4187076037' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Dec 01 09:39:33 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2863532065' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec 01 09:39:33 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0) v1
Dec 01 09:39:33 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2738387335' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Dec 01 09:39:33 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd utilization"} v 0) v1
Dec 01 09:39:33 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3056155381' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]: {
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:     "0": [
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:         {
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             "devices": [
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "/dev/loop3"
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             ],
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             "lv_name": "ceph_lv0",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             "lv_size": "21470642176",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             "name": "ceph_lv0",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             "tags": {
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.cluster_name": "ceph",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.crush_device_class": "",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.encrypted": "0",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.osd_id": "0",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.type": "block",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.vdo": "0"
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             },
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             "type": "block",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             "vg_name": "ceph_vg0"
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:         }
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:     ],
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:     "1": [
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:         {
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             "devices": [
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "/dev/loop4"
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             ],
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             "lv_name": "ceph_lv1",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             "lv_size": "21470642176",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             "name": "ceph_lv1",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             "tags": {
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.cluster_name": "ceph",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.crush_device_class": "",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.encrypted": "0",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.osd_id": "1",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.type": "block",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.vdo": "0"
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             },
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             "type": "block",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             "vg_name": "ceph_vg1"
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:         }
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:     ],
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:     "2": [
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:         {
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             "devices": [
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "/dev/loop5"
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             ],
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             "lv_name": "ceph_lv2",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             "lv_size": "21470642176",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             "name": "ceph_lv2",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             "tags": {
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.cluster_name": "ceph",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.crush_device_class": "",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.encrypted": "0",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.osd_id": "2",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.type": "block",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:                 "ceph.vdo": "0"
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             },
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             "type": "block",
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:             "vg_name": "ceph_vg2"
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:         }
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]:     ]
Dec 01 09:39:33 compute-0 quizzical_yalow[261575]: }
Dec 01 09:39:33 compute-0 systemd[1]: libpod-d47e3746fc9187e59d95a88b5eac2c80c66dce35c0de6ea7bb2f386d9da1aea3.scope: Deactivated successfully.
Dec 01 09:39:33 compute-0 podman[261541]: 2025-12-01 09:39:33.727877889 +0000 UTC m=+1.026551630 container died d47e3746fc9187e59d95a88b5eac2c80c66dce35c0de6ea7bb2f386d9da1aea3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_yalow, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Dec 01 09:39:33 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v871: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:33 compute-0 systemd[1]: var-lib-containers-storage-overlay-dda5c0e15ffdc170d50a99b16dd0975927008be6e08265cc75de0eca91d44f3e-merged.mount: Deactivated successfully.
Dec 01 09:39:33 compute-0 podman[261541]: 2025-12-01 09:39:33.802455515 +0000 UTC m=+1.101129226 container remove d47e3746fc9187e59d95a88b5eac2c80c66dce35c0de6ea7bb2f386d9da1aea3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_yalow, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:39:33 compute-0 systemd[1]: libpod-conmon-d47e3746fc9187e59d95a88b5eac2c80c66dce35c0de6ea7bb2f386d9da1aea3.scope: Deactivated successfully.
Dec 01 09:39:33 compute-0 sudo[261361]: pam_unix(sudo:session): session closed for user root
Dec 01 09:39:33 compute-0 podman[261715]: 2025-12-01 09:39:33.870087722 +0000 UTC m=+0.105199239 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, container_name=ovn_controller, org.label-schema.vendor=CentOS)
Dec 01 09:39:33 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14820 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:33 compute-0 sudo[261748]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:39:33 compute-0 sudo[261748]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:39:33 compute-0 sudo[261748]: pam_unix(sudo:session): session closed for user root
Dec 01 09:39:33 compute-0 sudo[261780]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:39:33 compute-0 sudo[261780]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:39:33 compute-0 sudo[261780]: pam_unix(sudo:session): session closed for user root
Dec 01 09:39:33 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14822 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:34 compute-0 sudo[261808]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:39:34 compute-0 sudo[261808]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:39:34 compute-0 sudo[261808]: pam_unix(sudo:session): session closed for user root
Dec 01 09:39:34 compute-0 nova_compute[250706]: 2025-12-01 09:39:34.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:39:34 compute-0 nova_compute[250706]: 2025-12-01 09:39:34.053 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 09:39:34 compute-0 nova_compute[250706]: 2025-12-01 09:39:34.054 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 09:39:34 compute-0 sudo[261855]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- raw list --format json
Dec 01 09:39:34 compute-0 sudo[261855]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:39:34 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14824 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:34 compute-0 nova_compute[250706]: 2025-12-01 09:39:34.381 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 09:39:34 compute-0 nova_compute[250706]: 2025-12-01 09:39:34.381 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:39:34 compute-0 nova_compute[250706]: 2025-12-01 09:39:34.382 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:39:34 compute-0 nova_compute[250706]: 2025-12-01 09:39:34.382 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:39:34 compute-0 nova_compute[250706]: 2025-12-01 09:39:34.382 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:39:34 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14826 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:34 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2738387335' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Dec 01 09:39:34 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3056155381' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Dec 01 09:39:34 compute-0 podman[261957]: 2025-12-01 09:39:34.479901029 +0000 UTC m=+0.047804876 container create 87a513e4ce80bd6995668db100cd38535f6823c00f52c3c714056cb61422d1cf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=distracted_panini, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 01 09:39:34 compute-0 systemd[1]: Started libpod-conmon-87a513e4ce80bd6995668db100cd38535f6823c00f52c3c714056cb61422d1cf.scope.
Dec 01 09:39:34 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:39:34 compute-0 podman[261957]: 2025-12-01 09:39:34.460515411 +0000 UTC m=+0.028419278 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:39:34 compute-0 podman[261957]: 2025-12-01 09:39:34.568420716 +0000 UTC m=+0.136324573 container init 87a513e4ce80bd6995668db100cd38535f6823c00f52c3c714056cb61422d1cf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=distracted_panini, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:39:34 compute-0 podman[261957]: 2025-12-01 09:39:34.579350691 +0000 UTC m=+0.147254538 container start 87a513e4ce80bd6995668db100cd38535f6823c00f52c3c714056cb61422d1cf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=distracted_panini, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Dec 01 09:39:34 compute-0 podman[261957]: 2025-12-01 09:39:34.582572104 +0000 UTC m=+0.150475971 container attach 87a513e4ce80bd6995668db100cd38535f6823c00f52c3c714056cb61422d1cf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=distracted_panini, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:39:34 compute-0 distracted_panini[261986]: 167 167
Dec 01 09:39:34 compute-0 systemd[1]: libpod-87a513e4ce80bd6995668db100cd38535f6823c00f52c3c714056cb61422d1cf.scope: Deactivated successfully.
Dec 01 09:39:34 compute-0 podman[261957]: 2025-12-01 09:39:34.58974432 +0000 UTC m=+0.157648187 container died 87a513e4ce80bd6995668db100cd38535f6823c00f52c3c714056cb61422d1cf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=distracted_panini, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec 01 09:39:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-67e8729228eea99f15b209ed416927b1f1d0c47a6053312d461f783943111c23-merged.mount: Deactivated successfully.
Dec 01 09:39:34 compute-0 podman[261957]: 2025-12-01 09:39:34.625756206 +0000 UTC m=+0.193660053 container remove 87a513e4ce80bd6995668db100cd38535f6823c00f52c3c714056cb61422d1cf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=distracted_panini, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Dec 01 09:39:34 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14828 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:34 compute-0 systemd[1]: libpod-conmon-87a513e4ce80bd6995668db100cd38535f6823c00f52c3c714056cb61422d1cf.scope: Deactivated successfully.
Dec 01 09:39:34 compute-0 nova_compute[250706]: 2025-12-01 09:39:34.707 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:39:34 compute-0 nova_compute[250706]: 2025-12-01 09:39:34.709 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:39:34 compute-0 nova_compute[250706]: 2025-12-01 09:39:34.710 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:39:34 compute-0 nova_compute[250706]: 2025-12-01 09:39:34.710 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 09:39:34 compute-0 nova_compute[250706]: 2025-12-01 09:39:34.711 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:39:34 compute-0 podman[262035]: 2025-12-01 09:39:34.789785816 +0000 UTC m=+0.040102265 container create 3ce231b0bf31c0551fa2e8386f22eb8a2ec1e5918c2faf085a08e0e6c4884184 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_fermi, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Dec 01 09:39:34 compute-0 systemd[1]: Started libpod-conmon-3ce231b0bf31c0551fa2e8386f22eb8a2ec1e5918c2faf085a08e0e6c4884184.scope.
Dec 01 09:39:34 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:39:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/23f6c77864f59169a1e3035657990402d96e9495be6d2ce555d3ecc68cf0f508/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:39:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/23f6c77864f59169a1e3035657990402d96e9495be6d2ce555d3ecc68cf0f508/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:39:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/23f6c77864f59169a1e3035657990402d96e9495be6d2ce555d3ecc68cf0f508/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:39:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/23f6c77864f59169a1e3035657990402d96e9495be6d2ce555d3ecc68cf0f508/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:39:34 compute-0 podman[262035]: 2025-12-01 09:39:34.772369765 +0000 UTC m=+0.022686244 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:39:34 compute-0 podman[262035]: 2025-12-01 09:39:34.88273779 +0000 UTC m=+0.133054249 container init 3ce231b0bf31c0551fa2e8386f22eb8a2ec1e5918c2faf085a08e0e6c4884184 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_fermi, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:39:34 compute-0 podman[262035]: 2025-12-01 09:39:34.893357746 +0000 UTC m=+0.143674205 container start 3ce231b0bf31c0551fa2e8386f22eb8a2ec1e5918c2faf085a08e0e6c4884184 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_fermi, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.build-date=20250507)
Dec 01 09:39:34 compute-0 podman[262035]: 2025-12-01 09:39:34.896226618 +0000 UTC m=+0.146543067 container attach 3ce231b0bf31c0551fa2e8386f22eb8a2ec1e5918c2faf085a08e0e6c4884184 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_fermi, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:39:35 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14832 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:35 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:39:35 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1425841886' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:39:35 compute-0 nova_compute[250706]: 2025-12-01 09:39:35.226 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.515s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:39:35 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "quorum_status"} v 0) v1
Dec 01 09:39:35 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/852801530' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Dec 01 09:39:35 compute-0 nova_compute[250706]: 2025-12-01 09:39:35.401 250710 WARNING nova.virt.libvirt.driver [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 09:39:35 compute-0 nova_compute[250706]: 2025-12-01 09:39:35.403 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=4949MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 09:39:35 compute-0 nova_compute[250706]: 2025-12-01 09:39:35.403 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:39:35 compute-0 nova_compute[250706]: 2025-12-01 09:39:35.404 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:39:35 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14838 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:35 compute-0 ceph-mon[75031]: pgmap v871: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:35 compute-0 ceph-mon[75031]: from='client.14820 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:35 compute-0 ceph-mon[75031]: from='client.14822 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:35 compute-0 ceph-mon[75031]: from='client.14824 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:35 compute-0 ceph-mon[75031]: from='client.14826 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:35 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1425841886' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:39:35 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/852801530' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Dec 01 09:39:35 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:39:35 compute-0 nova_compute[250706]: 2025-12-01 09:39:35.558 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 09:39:35 compute-0 nova_compute[250706]: 2025-12-01 09:39:35.558 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 09:39:35 compute-0 nova_compute[250706]: 2025-12-01 09:39:35.593 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:39:35 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "versions"} v 0) v1
Dec 01 09:39:35 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3090802399' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Dec 01 09:39:35 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v872: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:35 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14842 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:35 compute-0 nervous_fermi[262069]: {
Dec 01 09:39:35 compute-0 nervous_fermi[262069]:     "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec 01 09:39:35 compute-0 nervous_fermi[262069]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:39:35 compute-0 nervous_fermi[262069]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 01 09:39:35 compute-0 nervous_fermi[262069]:         "osd_id": 0,
Dec 01 09:39:35 compute-0 nervous_fermi[262069]:         "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:39:35 compute-0 nervous_fermi[262069]:         "type": "bluestore"
Dec 01 09:39:35 compute-0 nervous_fermi[262069]:     },
Dec 01 09:39:35 compute-0 nervous_fermi[262069]:     "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec 01 09:39:35 compute-0 nervous_fermi[262069]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:39:35 compute-0 nervous_fermi[262069]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 01 09:39:35 compute-0 nervous_fermi[262069]:         "osd_id": 1,
Dec 01 09:39:35 compute-0 nervous_fermi[262069]:         "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:39:35 compute-0 nervous_fermi[262069]:         "type": "bluestore"
Dec 01 09:39:35 compute-0 nervous_fermi[262069]:     },
Dec 01 09:39:35 compute-0 nervous_fermi[262069]:     "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec 01 09:39:35 compute-0 nervous_fermi[262069]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:39:35 compute-0 nervous_fermi[262069]:         "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec 01 09:39:35 compute-0 nervous_fermi[262069]:         "osd_id": 2,
Dec 01 09:39:35 compute-0 nervous_fermi[262069]:         "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:39:35 compute-0 nervous_fermi[262069]:         "type": "bluestore"
Dec 01 09:39:35 compute-0 nervous_fermi[262069]:     }
Dec 01 09:39:35 compute-0 nervous_fermi[262069]: }
Dec 01 09:39:35 compute-0 systemd[1]: libpod-3ce231b0bf31c0551fa2e8386f22eb8a2ec1e5918c2faf085a08e0e6c4884184.scope: Deactivated successfully.
Dec 01 09:39:35 compute-0 systemd[1]: libpod-3ce231b0bf31c0551fa2e8386f22eb8a2ec1e5918c2faf085a08e0e6c4884184.scope: Consumed 1.033s CPU time.
Dec 01 09:39:35 compute-0 podman[262035]: 2025-12-01 09:39:35.98644204 +0000 UTC m=+1.236758489 container died 3ce231b0bf31c0551fa2e8386f22eb8a2ec1e5918c2faf085a08e0e6c4884184 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_fermi, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default)
Dec 01 09:39:36 compute-0 systemd[1]: var-lib-containers-storage-overlay-23f6c77864f59169a1e3035657990402d96e9495be6d2ce555d3ecc68cf0f508-merged.mount: Deactivated successfully.
Dec 01 09:39:36 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:39:36 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/851619639' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:39:36 compute-0 podman[262035]: 2025-12-01 09:39:36.058652358 +0000 UTC m=+1.308968807 container remove 3ce231b0bf31c0551fa2e8386f22eb8a2ec1e5918c2faf085a08e0e6c4884184 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_fermi, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec 01 09:39:36 compute-0 systemd[1]: libpod-conmon-3ce231b0bf31c0551fa2e8386f22eb8a2ec1e5918c2faf085a08e0e6c4884184.scope: Deactivated successfully.
Dec 01 09:39:36 compute-0 nova_compute[250706]: 2025-12-01 09:39:36.079 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:39:36 compute-0 nova_compute[250706]: 2025-12-01 09:39:36.084 250710 DEBUG nova.compute.provider_tree [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed in ProviderTree for provider: 847e3dbe-0f76-4032-a374-8c965945c22f update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 09:39:36 compute-0 sudo[261855]: pam_unix(sudo:session): session closed for user root
Dec 01 09:39:36 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:39:36 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:39:36 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:39:36 compute-0 nova_compute[250706]: 2025-12-01 09:39:36.115 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed for provider 847e3dbe-0f76-4032-a374-8c965945c22f based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 09:39:36 compute-0 nova_compute[250706]: 2025-12-01 09:39:36.117 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 09:39:36 compute-0 nova_compute[250706]: 2025-12-01 09:39:36.117 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.713s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:39:36 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:39:36 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 0c050f7f-cce4-4ef3-88f8-7df6d6d6df2d does not exist
Dec 01 09:39:36 compute-0 sudo[262314]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:39:36 compute-0 sudo[262314]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:39:36 compute-0 sudo[262314]: pam_unix(sudo:session): session closed for user root
Dec 01 09:39:36 compute-0 sudo[262339]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:39:36 compute-0 sudo[262339]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:39:36 compute-0 sudo[262339]: pam_unix(sudo:session): session closed for user root
Dec 01 09:39:36 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0) v1
Dec 01 09:39:36 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2853296270' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 01 09:39:36 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14848 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:36 compute-0 ceph-mon[75031]: from='client.14828 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:36 compute-0 ceph-mon[75031]: from='client.14832 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:36 compute-0 ceph-mon[75031]: from='client.14838 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:36 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3090802399' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Dec 01 09:39:36 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/851619639' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:39:36 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:39:36 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:39:36 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2853296270' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 01 09:39:36 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0) v1
Dec 01 09:39:36 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2162815706' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Dec 01 09:39:36 compute-0 nova_compute[250706]: 2025-12-01 09:39:36.787 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:39:36 compute-0 nova_compute[250706]: 2025-12-01 09:39:36.788 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:39:36 compute-0 nova_compute[250706]: 2025-12-01 09:39:36.788 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 09:39:36 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 01 09:39:36 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.12( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.100702286s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961563110s@ mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.12( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.100702286s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961563110s@ mbc={}] enter Started/Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.899159 11 0.000061
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.907846 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.907908 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.907939 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.100515366s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961517334s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.100497246s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961517334s@ mbc={}] exit Reset 0.000028 1 0.000051
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.100497246s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961517334s@ mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.100497246s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961517334s@ mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.100497246s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961517334s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.100497246s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961517334s@ mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.100497246s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961517334s@ mbc={}] enter Started/Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.899132 11 0.000110
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.907620 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 6.947257 4 0.000051
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 6.958917 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 6.959001 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.907682 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 6.959026 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.907718 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.052310944s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.913497925s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.052274704s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913497925s@ mbc={}] exit Reset 0.000052 1 0.000123
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.100352287s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961570740s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.052274704s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913497925s@ mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.052274704s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913497925s@ mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.052274704s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913497925s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.052274704s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913497925s@ mbc={}] exit Start 0.000008 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.052274704s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913497925s@ mbc={}] enter Started/Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.100304604s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961570740s@ mbc={}] exit Reset 0.000082 1 0.000178
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.100304604s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961570740s@ mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.100304604s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961570740s@ mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.100304604s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961570740s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.100304604s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961570740s@ mbc={}] exit Start 0.000009 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.100304604s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961570740s@ mbc={}] enter Started/Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.899823 11 0.000070
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.908555 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.908651 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.908673 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.099813461s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961418152s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.099791527s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961418152s@ mbc={}] exit Reset 0.000042 1 0.000089
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.099791527s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961418152s@ mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.099791527s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961418152s@ mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.099791527s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961418152s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.099791527s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961418152s@ mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.099791527s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961418152s@ mbc={}] enter Started/Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.900541 11 0.000057
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.909353 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 6.948301 4 0.000024
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.909412 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.909439 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 6.959772 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 6.959863 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 6.959899 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.099128723s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961410522s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 6.948382 4 0.000047
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.899714 11 0.000066
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 6.959829 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.099099159s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961410522s@ mbc={}] exit Reset 0.000064 1 0.000099
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.909228 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.099099159s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961410522s@ mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 6.959885 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.099099159s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961410522s@ mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.909288 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 6.959905 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.099099159s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961410522s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.099099159s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961410522s@ mbc={}] exit Start 0.000019 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.909328 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.099099159s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961410522s@ mbc={}] enter Started/Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.051272392s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.913627625s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.099069595s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961448669s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.051211357s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.913597107s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.099026680s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961448669s@ mbc={}] exit Reset 0.000061 1 0.000939
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.099026680s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961448669s@ mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.099026680s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961448669s@ mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.099026680s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961448669s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.099026680s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961448669s@ mbc={}] exit Start 0.000010 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.099026680s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961448669s@ mbc={}] enter Started/Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.051125526s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913627625s@ mbc={}] exit Reset 0.000219 1 0.000304
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.051125526s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913627625s@ mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.051125526s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913627625s@ mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.051125526s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913627625s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.051125526s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913627625s@ mbc={}] exit Start 0.000008 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.051125526s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913627625s@ mbc={}] enter Started/Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.900912 11 0.000068
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.909720 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.909784 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 6.948647 4 0.000029
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.909808 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 6.959972 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 6.960027 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 6.960044 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098785400s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961402893s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.051002502s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.913635254s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098758698s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961402893s@ mbc={}] exit Reset 0.000050 1 0.000073
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098758698s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961402893s@ mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098758698s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961402893s@ mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098758698s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961402893s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050975800s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913635254s@ mbc={}] exit Reset 0.000051 1 0.000073
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098758698s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961402893s@ mbc={}] exit Start 0.000009 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098758698s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961402893s@ mbc={}] enter Started/Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050975800s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913635254s@ mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050975800s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913635254s@ mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050975800s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913635254s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050975800s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913635254s@ mbc={}] exit Start 0.000009 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050975800s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913635254s@ mbc={}] enter Started/Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 6.948742 4 0.000042
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 6.960052 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 6.960128 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 6.960164 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050869942s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.913658142s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050841331s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913658142s@ mbc={}] exit Reset 0.000053 1 0.000085
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050841331s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913658142s@ mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050841331s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913658142s@ mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050841331s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913658142s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050841331s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913658142s@ mbc={}] exit Start 0.000007 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 6.948818 4 0.000025
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050841331s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913658142s@ mbc={}] enter Started/Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.900751 11 0.000510
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 6.960058 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.910112 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 6.960115 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.910210 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 6.960157 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.910232 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098507881s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961387634s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.900906 11 0.000088
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050839424s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.913719177s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.909244 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098463058s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961387634s@ mbc={}] exit Reset 0.000062 1 0.000088
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050799370s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913719177s@ mbc={}] exit Reset 0.000067 1 0.000095
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098463058s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961387634s@ mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050799370s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913719177s@ mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.910402 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098463058s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961387634s@ mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050799370s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913719177s@ mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.910458 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050799370s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913719177s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050799370s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913719177s@ mbc={}] exit Start 0.000016 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098463058s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961387634s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050799370s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913719177s@ mbc={}] enter Started/Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098463058s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961387634s@ mbc={}] exit Start 0.000027 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098623276s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961585999s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098463058s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961387634s@ mbc={}] enter Started/Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098595619s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961585999s@ mbc={}] exit Reset 0.000072 1 0.000099
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098595619s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961585999s@ mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098595619s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961585999s@ mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098595619s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961585999s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098595619s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961585999s@ mbc={}] exit Start 0.000007 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 6.948986 4 0.000046
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098595619s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961585999s@ mbc={}] enter Started/Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 6.960057 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 6.960110 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 6.960127 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.901486 11 0.000160
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.910704 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.910808 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.910862 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050669670s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.913757324s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050644875s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913757324s@ mbc={}] exit Reset 0.000046 1 0.000070
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098088264s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961196899s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050644875s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913757324s@ mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 6.938226 4 0.000035
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050644875s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913757324s@ mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050644875s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913757324s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 6.959672 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050644875s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913757324s@ mbc={}] exit Start 0.000011 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098061562s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961196899s@ mbc={}] exit Reset 0.000044 1 0.000068
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.050644875s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913757324s@ mbc={}] enter Started/Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 6.959778 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098061562s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961196899s@ mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 6.959827 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098061562s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961196899s@ mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098061562s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961196899s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098061562s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961196899s@ mbc={}] exit Start 0.000010 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.098061562s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961196899s@ mbc={}] enter Started/Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061552048s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.924736023s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061531067s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.924736023s@ mbc={}] exit Reset 0.000049 1 0.000082
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061531067s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.924736023s@ mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.901759 11 0.000138
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061531067s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.924736023s@ mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.910979 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.911091 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.911114 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 6.937937 4 0.000034
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 6.959016 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 6.959165 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 6.959191 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097998619s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961250305s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061717033s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.924995422s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097971916s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961250305s@ mbc={}] exit Reset 0.000046 1 0.000089
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097971916s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961250305s@ mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097971916s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961250305s@ mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097971916s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961250305s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061690331s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.924995422s@ mbc={}] exit Reset 0.000046 1 0.000066
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097971916s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961250305s@ mbc={}] exit Start 0.000018 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061690331s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.924995422s@ mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097971916s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961250305s@ mbc={}] enter Started/Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061690331s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.924995422s@ mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061690331s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.924995422s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061690331s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.924995422s@ mbc={}] exit Start 0.000009 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061531067s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.924736023s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061531067s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.924736023s@ mbc={}] exit Start 0.000142 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061531067s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.924736023s@ mbc={}] enter Started/Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061690331s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.924995422s@ mbc={}] enter Started/Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.901983 11 0.000056
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.911243 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.911336 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.911356 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.901879 11 0.000056
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.911141 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.911324 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097698212s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961128235s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.911381 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.902321 11 0.000071
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.911482 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.911525 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.911543 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097672462s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961128235s@ mbc={}] exit Reset 0.000043 1 0.000067
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097672462s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961128235s@ mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097672462s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961128235s@ mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097724915s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961204529s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097672462s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961128235s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097575188s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961067200s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097672462s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961128235s@ mbc={}] exit Start 0.000017 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097672462s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961128235s@ mbc={}] enter Started/Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097554207s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961067200s@ mbc={}] exit Reset 0.000040 1 0.000063
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097699165s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961204529s@ mbc={}] exit Reset 0.000058 1 0.000080
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097554207s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961067200s@ mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097699165s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961204529s@ mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097554207s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961067200s@ mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097699165s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961204529s@ mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097699165s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961204529s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097699165s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961204529s@ mbc={}] exit Start 0.000008 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097699165s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961204529s@ mbc={}] enter Started/Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 6.938361 4 0.000047
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 6.959675 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 6.959755 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 6.959782 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.901986 11 0.000110
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097554207s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961067200s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061326981s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.924926758s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.910996 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097554207s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961067200s@ mbc={}] exit Start 0.000091 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097554207s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961067200s@ mbc={}] enter Started/Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.911770 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.911788 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061299324s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.924926758s@ mbc={}] exit Reset 0.000048 1 0.000068
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061299324s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.924926758s@ mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061299324s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.924926758s@ mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061299324s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.924926758s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061299324s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.924926758s@ mbc={}] exit Start 0.000008 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061299324s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.924926758s@ mbc={}] enter Started/Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097580910s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961235046s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097556114s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961235046s@ mbc={}] exit Reset 0.000112 1 0.000081
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097556114s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961235046s@ mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097556114s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961235046s@ mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097556114s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961235046s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097556114s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961235046s@ mbc={}] exit Start 0.000018 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097556114s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961235046s@ mbc={}] enter Started/Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.902862 11 0.000060
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.912216 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.912295 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.912321 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097126961s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.960968018s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 6.937735 4 0.000037
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 6.957522 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097102165s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.960968018s@ mbc={}] exit Reset 0.000043 1 0.000077
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 6.959861 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097102165s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.960968018s@ mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.902667 11 0.000067
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 6.959902 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097102165s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.960968018s@ mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097102165s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.960968018s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.911984 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097102165s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.960968018s@ mbc={}] exit Start 0.000008 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097102165s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.960968018s@ mbc={}] enter Started/Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.912369 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.912402 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097393990s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961318970s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097324371s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961318970s@ mbc={}] exit Reset 0.000087 1 0.000077
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097324371s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961318970s@ mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097324371s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961318970s@ mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097324371s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961318970s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 6.938706 4 0.000033
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097324371s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961318970s@ mbc={}] exit Start 0.000008 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.097324371s) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961318970s@ mbc={}] enter Started/Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 6.959490 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 6.959621 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 6.959662 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061881065s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.926002502s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 8.903077 11 0.000089
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 8.912414 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 8.912606 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060898781s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.925033569s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] exit Started 8.912693 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=41) [0] r=0 lpr=41 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061853409s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.926002502s@ mbc={}] exit Reset 0.000082 1 0.000288
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061853409s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.926002502s@ mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061853409s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.926002502s@ mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061853409s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.926002502s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061853409s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.926002502s@ mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.061853409s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.926002502s@ mbc={}] enter Started/Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060843468s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.925033569s@ mbc={}] exit Reset 0.000076 1 0.000136
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060843468s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.925033569s@ mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.096854210s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active pruub 91.961059570s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060843468s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.925033569s@ mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060843468s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.925033569s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060843468s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.925033569s@ mbc={}] exit Start 0.000009 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060843468s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.925033569s@ mbc={}] enter Started/Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.096827507s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961059570s@ mbc={}] exit Reset 0.000061 1 0.000092
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.096827507s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961059570s@ mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.096827507s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961059570s@ mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.096827507s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961059570s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.096827507s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961059570s@ mbc={}] exit Start 0.000008 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46 pruub=15.096827507s) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.961059570s@ mbc={}] enter Started/Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 6.938899 4 0.000021
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 6.959425 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 6.959530 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 6.959573 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 6.938910 4 0.000023
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060742378s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.925109863s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 6.959308 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 6.959383 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 6.959403 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060714722s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.925109863s@ mbc={}] exit Reset 0.000047 1 0.000071
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 6.938915 4 0.000031
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060762405s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.925170898s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 6.959544 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060714722s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.925109863s@ mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 6.959733 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060714722s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.925109863s@ mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] exit Started 6.959753 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060714722s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.925109863s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060591698s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active pruub 85.925079346s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060490608s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.925079346s@ mbc={}] exit Reset 0.000162 1 0.000276
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060739517s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.925170898s@ mbc={}] exit Reset 0.000042 1 0.000073
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060490608s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.925079346s@ mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060490608s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.925079346s@ mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060714722s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.925109863s@ mbc={}] exit Start 0.000010 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060739517s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.925170898s@ mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060490608s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.925079346s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060739517s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.925170898s@ mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060739517s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.925170898s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060739517s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.925170898s@ mbc={}] exit Start 0.000018 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060739517s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.925170898s@ mbc={}] enter Started/Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060490608s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.925079346s@ mbc={}] exit Start 0.000014 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060714722s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.925109863s@ mbc={}] enter Started/Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.060490608s) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.925079346s@ mbc={}] enter Started/Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.047507286s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913597107s@ mbc={}] exit Reset 0.003759 1 0.003789
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.047507286s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913597107s@ mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 9) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.047507286s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913597107s@ mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.047507286s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913597107s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.047507286s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913597107s@ mbc={}] exit Start 0.000031 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46 pruub=9.047507286s) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 85.913597107s@ mbc={}] enter Started/Stray
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:15:26.586243+0000 osd.0 (osd.0) 8 : cluster [DBG] 4.3 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:15:26.599444+0000 osd.0 (osd.0) 9 : cluster [DBG] 4.3 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 46 handle_osd_map epochs [46,46], i have 46, src has [1,46]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.1e(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.1e( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000075 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.1e( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.1e( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000017 1 0.000026
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.1e( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.1e( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.1e( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.1e( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000007 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.1e( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.1e( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.1e( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.1e( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000113 1 0.000065
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.1e( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.19(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.19( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000036 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.19( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.19( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000008 1 0.000014
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.19( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.19( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.19( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.19( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.19( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.19( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.19( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.19( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000073 1 0.000040
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.19( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.18(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.18( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000026 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.18( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.18( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000017 1 0.000022
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.18( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.18( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.18( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.18( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000009 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.18( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.18( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.18( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.18( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000088 1 0.000113
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.18( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.16(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.16( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000057 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.16( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.16( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000007 1 0.000012
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.16( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.16( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.16( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.16( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.16( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.16( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.16( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.16( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000061 1 0.000066
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.16( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.001084 2 0.000055
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000802 2 0.000036
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000009 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.13(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.13( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000039 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.13( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.13( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000020
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.13( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.13( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.13( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.13( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.13( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.13( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.13( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.13( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000069 1 0.000032
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.13( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.14(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.14( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000019 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.14( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.14( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000016 1 0.000019
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.14( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.14( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.14( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.14( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.14( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.14( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.14( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.14( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000034 1 0.000026
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.14( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.15(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.15( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000022 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.15( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.15( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000005 1 0.000013
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.15( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.15( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.15( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.15( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.15( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.15( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.15( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.15( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000045 1 0.000026
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.15( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.f(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.f( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000017 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.f( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.f( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000005 1 0.000012
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.f( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.f( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.f( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.f( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000008 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.f( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.f( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.f( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.f( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000047 1 0.000029
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.f( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.7(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.7( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000025 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.7( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.7( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000011
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.7( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.7( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.7( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.7( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.7( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.7( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.7( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.7( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000062 1 0.000033
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.7( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.2(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.2( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000043 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.2( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.2( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000007 1 0.000017
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.2( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.2( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.2( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.2( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.2( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.2( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.2( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.2( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000041 1 0.000031
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.2( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.4(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.4( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000021 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.4( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.4( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000016
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.4( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.4( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.4( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.4( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.4( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.4( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.4( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.4( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000040 1 0.000026
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.4( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.11(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.11( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000067 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.11( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.11( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000021 1 0.000033
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.11( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.11( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.11( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.11( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000008 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.11( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.11( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.11( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.11( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000118 1 0.000044
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.11( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.3(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.3( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000027 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.3( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.3( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000015
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.3( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.3( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.3( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.3( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.3( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.3( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.3( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.3( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000344 1 0.000042
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.3( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.8(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.8( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000046 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.8( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.8( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000007 1 0.000017
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.8( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.8( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.8( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.8( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000008 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.8( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.8( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.8( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.8( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000096 1 0.000040
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.8( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.5(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.5( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000029 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.5( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.5( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000008 1 0.000018
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.5( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.5( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.5( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.5( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.5( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.5( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.5( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.5( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000086 1 0.000049
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.5( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.2(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.2( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000020 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.2( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=0 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.2( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000007 1 0.000013
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.2( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.2( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.2( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.2( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000015 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.2( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.2( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.2( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.2( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000062 1 0.000040
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.2( empty local-lis/les=0/0 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.b(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000018 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000011
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000055 1 0.000031
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.b( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.005222 2 0.000030
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.004995 2 0.000037
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.003670 2 0.000031
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.003554 2 0.000018
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.003431 2 0.000025
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1d(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000072 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000017 1 0.000028
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000198 1 0.000049
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1d( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1f(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1f( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000040 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1f( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1f( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000008 1 0.000021
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1f( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1f( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1f( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1f( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1f( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1f( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1f( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1f( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000095 1 0.000042
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1f( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1c(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1c( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000048 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1c( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1c( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000010 1 0.000023
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1c( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1c( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1c( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1c( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1c( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1c( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1c( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1c( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000103 1 0.000045
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1c( empty local-lis/les=0/0 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.004735 2 0.000029
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.003804 2 0.000022
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.003617 2 0.000025
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.003503 2 0.000025
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.003187 2 0.000056
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.002758 2 0.000030
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.003091 2 0.000042
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.002976 2 0.000044
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.002587 2 0.000034
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.002838 2 0.000035
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.001937 2 0.000133
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.001707 2 0.000088
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.001386 2 0.000037
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 46 handle_osd_map epochs [46,46], i have 46, src has [1,46]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.17(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.17( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000081 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.17( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.17( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000022 1 0.000041
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.17( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.17( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.17( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.17( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000008 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.17( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.17( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.17( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.17( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000133 1 0.000104
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.17( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.13(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.13( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000043 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.13( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.13( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000011
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.13( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.13( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.13( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.13( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.13( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.13( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.13( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.13( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000172 1 0.000032
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.13( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.15(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.15( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000020 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.15( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.15( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000009
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.15( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.15( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.15( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.15( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000023 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.15( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.15( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.15( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.15( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000459 1 0.000070
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.15( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.12(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.12( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000018 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.12( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.12( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000012 1 0.000016
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.12( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.12( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.12( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.12( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.12( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.12( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.12( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.12( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000060 1 0.000033
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.12( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.f(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000090 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000008 1 0.000025
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.9(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000279 1 0.000043
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.9( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000268 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.9( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.9( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000015 1 0.000076
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.9( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.9( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.9( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.9( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.9( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.9( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.9( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.9( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000124 1 0.000070
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.9( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.c(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000018 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000005 1 0.000008
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000060 1 0.000027
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.f(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.f( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000015 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.f( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.f( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000005 1 0.000008
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.f( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.f( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.f( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.f( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.f( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.f( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.f( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.f( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000057 1 0.000030
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.f( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.6(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.6( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000054 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.6( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.6( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000009 1 0.000014
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.6( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.6( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.6( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.6( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.6( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.6( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.6( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.6( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000094 1 0.000106
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.6( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.3(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.3( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000412 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.3( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.3( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000007 1 0.000016
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.3( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.3( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.3( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.3( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.3( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.3( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.3( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.3( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000054 1 0.000028
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.3( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000021 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000010
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000052 1 0.000028
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.6(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.6( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000013 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.6( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.6( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000004 1 0.000007
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.6( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.6( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.6( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.6( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.6( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.6( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.6( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.6( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000097 1 0.000024
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.6( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.4(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.4( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000050 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.4( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.4( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000009 1 0.000017
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.4( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.4( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.4( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.4( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.4( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.4( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.4( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.4( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000172 1 0.000040
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.4( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.3(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.3( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000029 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.3( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.3( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000011
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.3( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.3( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.3( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.3( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.3( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.3( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.3( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.3( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000051 1 0.000025
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.3( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.9(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.9( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000048 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.9( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.9( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000393 1 0.000451
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.9( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.9( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.9( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.9( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000007 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.9( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.9( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.9( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.9( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000094 1 0.000348
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.9( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.a(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000016 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000009
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000007 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000052 1 0.000037
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1b(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1b( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000023 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1b( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1b( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000007 1 0.000019
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1b( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1b( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1b( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1b( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1b( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1b( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1b( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1b( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000084 1 0.000042
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1b( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.18(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.18( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000026 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.18( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.18( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000012
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.18( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.18( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.18( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.18( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.18( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.18( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.18( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.18( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000334 1 0.000033
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.18( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1f(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1f( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000022 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1f( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1f( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000017 1 0.000354
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1f( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1f( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1f( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1f( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1f( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1f( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1f( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1f( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000413 1 0.000031
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1f( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1b(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1b( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000024 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1b( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=0 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1b( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000005 1 0.000010
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1b( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1b( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1b( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1b( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000011 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1b( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1b( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1b( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1b( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000065 1 0.000046
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1b( empty local-lis/les=0/0 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1f(unlocked)] enter Initial
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000216 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=0 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000027 1 0.000018
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000075 1 0.000057
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: unregistering
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.008423 2 0.000097
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.008100 2 0.000046
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.007490 2 0.000074
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.007273 2 0.000036
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.009516 2 0.000055
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.009087 2 0.000047
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.008933 2 0.000030
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.008807 2 0.000353
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.008175 2 0.000067
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000007 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.007646 2 0.000021
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.007572 2 0.000022
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.007446 2 0.000308
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.006967 2 0.000148
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.005928 2 0.000038
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.005777 2 0.000039
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.007200 2 0.000044
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.005640 2 0.000027
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000010 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.005227 2 0.000033
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.004368 2 0.000037
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.003696 2 0.000022
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.004125 2 0.000024
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 46 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 57180160 unmapped: 2531328 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.6 deep-scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.6 deep-scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:14:57.794354+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 11 sent 9 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:15:27.605632+0000 osd.0 (osd.0) 10 : cluster [DBG] 4.6 deep-scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:15:27.619802+0000 osd.0 (osd.0) 11 : cluster [DBG] 4.6 deep-scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 46 handle_osd_map epochs [46,47], i have 46, src has [1,47]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.076978 2 0.000043
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.085585 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.17( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.073612 2 0.000020
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.077412 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.1f( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.073658 2 0.000031
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.077899 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.1b( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.1b( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.107793 2 0.000019
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.111135 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.11( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.11( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.109554 2 0.000021
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.113166 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.14( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 handle_osd_map epochs [47,47], i have 47, src has [1,47]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.14( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 handle_osd_map epochs [47,47], i have 47, src has [1,47]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.109668 2 0.000025
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.113443 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.13( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.13( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.109733 2 0.000024
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.113249 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.15( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.077672 2 0.000021
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.085045 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.15( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.12( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.077868 2 0.000021
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.086178 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.13( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.077858 2 0.000023
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.13( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.085847 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.15( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.110226 2 0.000025
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.115347 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.16( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.16( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.074890 2 0.000027
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.081107 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.9( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.108069 2 0.000041
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.111439 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.8( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.8( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.108169 2 0.000026
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.110852 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.b( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.075001 2 0.000025
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.b( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.081027 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.075469 2 0.000018
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.084358 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.a( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.f( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.075319 2 0.000029
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.082367 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.3( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.3( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.109071 2 0.000019
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.112210 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.3( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.3( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.075446 2 0.000032
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.083129 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.6( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 11) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:15:27.605632+0000 osd.0 (osd.0) 10 : cluster [DBG] 4.6 deep-scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:15:27.619802+0000 osd.0 (osd.0) 11 : cluster [DBG] 4.6 deep-scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.108592 2 0.000038
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.111586 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.2( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.2( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.076005 2 0.000038
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.083749 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.109653 2 0.000020
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.113344 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.2( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.108764 2 0.000018
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.3( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.108857 2 0.000046
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.2( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.112015 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.5( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.5( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.109866 2 0.000040
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.114686 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.f( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.108903 2 0.000020
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.110434 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.1c( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.1c( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.110593 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.109855 2 0.000018
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.076284 2 0.000044
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.113438 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.084622 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.4( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.6( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.6( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.076532 2 0.000045
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.085792 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.9( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.9( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.076107 2 0.000029
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.081709 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.18( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.18( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.109256 2 0.000033
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.111437 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.1d( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.1d( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.1f( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.1f( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.110288 2 0.000053
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.114182 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.7( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.076499 2 0.000032
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.7( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.084230 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.1( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.076891 2 0.000024
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.085917 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.c( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.076612 2 0.000030
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.084026 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.4( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.4( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.077129 2 0.000038
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.086979 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.f( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.4( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.076627 2 0.000037
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.081458 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.1f( empty local-lis/les=43/45 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.1f( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.076828 2 0.000041
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.082611 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.1b( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.112548 2 0.000034
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.117901 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.18( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.117355 2 0.000065
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.118612 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.1e( empty local-lis/les=41/42 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.1e( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.18( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.118909 2 0.000927
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.120724 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.19( empty local-lis/les=39/40 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.19( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.17( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.1b( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.11( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.17( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.010803 4 0.001317
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.17( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.17( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.17( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.1b( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.010623 4 0.000091
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.1b( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.1b( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000003 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.1b( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.11( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.010532 4 0.000054
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.11( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.11( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000003 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.11( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.1f( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.1f( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.012276 4 0.000127
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.1f( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.1f( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000011 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.1f( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.14( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.13( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.13( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.15( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.12( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.14( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017498 4 0.000087
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.14( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.13( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017419 4 0.000066
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.13( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.14( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.14( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.15( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.9( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.13( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000022 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.13( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.16( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.a( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.3( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.13( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017313 4 0.000069
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.13( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.13( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000010 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.13( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.15( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017343 4 0.000079
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.15( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.15( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.15( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.3( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.6( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.2( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.b( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.3( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.5( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.f( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.2( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.6( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.9( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.18( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.1d( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.1f( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.7( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.c( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.1( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.4( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.8( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.4( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.f( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.1b( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.1f( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.18( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.12( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018219 4 0.000072
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.12( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.12( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000009 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.1e( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=41/41 les/c/f=42/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.12( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.19( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.15( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018300 4 0.000210
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.15( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.15( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.15( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.9( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017773 4 0.000239
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.16( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018084 4 0.000258
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.9( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.16( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.16( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000009 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.9( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000010 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.16( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.9( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.3( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017559 4 0.000054
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.3( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.3( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.3( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.a( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017717 4 0.000277
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.3( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017552 4 0.000120
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.a( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.3( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.6( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017416 4 0.000163
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.3( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.a( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.3( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.a( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.6( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.6( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000012 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.6( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.b( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017800 4 0.000095
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.2( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017264 4 0.000161
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.b( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.2( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.b( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.2( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.b( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.2( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.3( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017159 4 0.000097
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.3( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.3( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000010 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.3( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.5( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017145 4 0.000119
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.5( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.f( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017081 4 0.000070
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.f( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.5( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000009 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.5( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.f( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.f( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.2( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017218 4 0.000208
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.2( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.6( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.016987 4 0.000108
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.6( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.2( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000010 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.2( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.6( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.6( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.9( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.016911 4 0.000075
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.9( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.9( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000010 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.9( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.18( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.016872 4 0.000094
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.18( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.1d( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.016859 4 0.000074
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.1d( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.1d( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.1d( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.1f( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.016811 4 0.000613
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.7( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.016768 4 0.000080
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.1f( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.7( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.7( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.7( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.1f( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000011 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.1f( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.c( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.016664 4 0.000064
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.c( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.c( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000009 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.c( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.1( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.016775 4 0.000193
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.1( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.1( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000009 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.1( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.4( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.016634 4 0.000070
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.4( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.4( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.4( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.8( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018250 4 0.001006
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.8( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.8( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.8( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.4( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017277 4 0.000778
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.4( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.4( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000026 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.4( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.1b( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.016444 4 0.000104
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.1b( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.1b( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.1b( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.f( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=43/43 les/c/f=45/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.f( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.016697 4 0.000112
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.f( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.f( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[3.f( empty local-lis/les=46/47 n=0 ec=39/23 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.1f( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.016713 4 0.000096
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.1f( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.1f( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.1f( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.18( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.014950 4 0.001693
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.18( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.18( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.18( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.19( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.014039 4 0.000082
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.19( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.19( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.19( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.1e( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.016531 4 0.000144
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.1e( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.1e( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[5.1e( empty local-lis/les=46/47 n=0 ec=41/27 lis/c=46/41 les/c/f=47/42/0 sis=46) [0] r=0 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.f( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.018438 4 0.001681
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.f( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.f( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.f( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.1c( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=39/39 les/c/f=40/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.1c( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.017680 4 0.001489
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.1c( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.1c( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000009 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[2.1c( empty local-lis/les=46/47 n=0 ec=39/21 lis/c=46/39 les/c/f=47/40/0 sis=46) [0] r=0 lpr=46 pi=[39,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.18( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[7.18( empty local-lis/les=46/47 n=0 ec=43/31 lis/c=46/43 les/c/f=47/45/0 sis=46) [0] r=0 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.146312 7 0.000649
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.148549 7 0.000085
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.12( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.150151 7 0.000040
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.12( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.12( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000138 1 0.000046
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000199 1 0.000020
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.10( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.12( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000235 1 0.000013
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.12( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.161438 7 0.000092
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.164459 7 0.000071
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.160778 7 0.000419
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.164792 7 0.000049
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000074 1 0.000059
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000174 1 0.000122
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.1c( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.162419 7 0.000169
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.11( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.165274 7 0.000074
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.11( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.11( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.162456 7 0.000070
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.15( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.166127 7 0.000098
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.15( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.15( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.163702 7 0.000120
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.14( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.166716 7 0.000095
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.14( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.14( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.13( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.166177 7 0.000051
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.13( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.13( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.164453 7 0.000100
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.164114 7 0.000108
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.163369 7 0.000095
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.163105 7 0.000132
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.165550 7 0.000201
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001088 1 0.000022
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.11( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001254 1 0.000183
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001048 1 0.000705
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000464 1 0.000040
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.1( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.14( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000518 1 0.000035
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.14( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.13( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000550 1 0.000028
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.13( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.11( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001381 1 0.000824
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.11( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000669 1 0.000051
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.e( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000722 1 0.000062
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000750 1 0.000034
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.1b( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000888 1 0.000023
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.18( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000996 1 0.000054
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.1a( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.15( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001858 1 0.001021
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.15( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001919 1 0.001405
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.17( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.168372 7 0.000057
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.163989 7 0.000290
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.17( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.17( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.14( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.168208 7 0.000063
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.165333 7 0.000091
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.14( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.14( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.165895 7 0.000052
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.164847 7 0.000064
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.165216 7 0.000064
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.165624 7 0.000065
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.165808 7 0.000149
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.164596 7 0.000867
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.165633 7 0.000083
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.165453 7 0.000198
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.165500 7 0.000069
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.166310 7 0.000097
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.167202 7 0.000075
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.162683 7 0.000117
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.164658 7 0.000131
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.166220 7 0.000107
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.165951 7 0.000113
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.17( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000418 1 0.000042
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.17( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.14( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000439 1 0.000034
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.14( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000508 1 0.000032
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000527 1 0.000017
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000554 1 0.000015
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.7( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000583 1 0.000017
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.5( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000615 1 0.000016
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000642 1 0.000017
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.2( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000675 1 0.000020
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000722 1 0.000015
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.4( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000745 1 0.000024
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000780 1 0.000019
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.9( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000806 1 0.000014
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.f( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000851 1 0.000014
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000875 1 0.000011
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000935 1 0.000014
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.d( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.164661 7 0.000116
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000987 1 0.000013
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.001288 1 0.000281
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000270 1 0.000922
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000193 1 0.001223
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.8( empty local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1c( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.023430 1 0.000072
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1c( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.023608 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1c( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.170184 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.10( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.029021 1 0.000030
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.10( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.029259 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.10( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.177843 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.12( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.032395 1 0.000028
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.12( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.032714 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.12( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.182888 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1f( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.024027 1 0.000057
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1f( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.024140 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1f( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.185280 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 57663488 unmapped: 2048000 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.1c( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.031688 1 0.000800
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.1c( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.031923 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.1c( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.193434 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.11( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.037971 1 0.000057
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.11( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.039142 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.11( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.203969 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.13( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.045224 1 0.000049
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.13( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.046521 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.13( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.211043 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.a( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.052315 1 0.000107
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.a( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.053463 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.a( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.216005 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.1( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.059661 1 0.000045
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.1( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.060172 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.1( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.223913 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.14( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.067183 1 0.000028
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.14( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.067757 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.14( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.234510 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.13( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.074338 1 0.000040
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.13( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.074927 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.13( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.241140 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.11( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.081611 1 0.000032
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.11( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.083024 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.11( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.248331 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.e( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.088909 1 0.000050
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.e( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.089631 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.e( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.254129 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.f( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.097239 1 0.000026
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.f( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.098023 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.f( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.262229 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.1b( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.103654 1 0.000043
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.1b( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.104452 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.1b( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.267601 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.18( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.110955 1 0.000092
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.18( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.111897 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.18( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.277497 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.1a( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.118195 1 0.000055
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.1a( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.119257 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.1a( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [2] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.282672 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.15( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.125407 1 0.000028
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.15( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.127297 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.15( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.293467 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.8( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.133034 1 0.000028
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.8( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.135001 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.8( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [2] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.297503 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.17( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.139392 1 0.000037
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.17( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.139845 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.17( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.308262 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.14( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.146621 1 0.000025
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.14( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.147106 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.14( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.315335 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.b( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.153924 1 0.000027
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.b( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.154467 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.b( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.319867 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.e( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.161405 1 0.000022
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.e( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.161981 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.e( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.327905 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.7( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.168794 1 0.000028
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.7( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.169387 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.7( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.334267 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.5( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.176065 1 0.000023
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.5( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.176683 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.5( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.341933 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.183314 1 0.000025
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.183957 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.349613 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.2( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.190869 1 0.000024
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.2( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.191555 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.2( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.357415 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.4( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.198198 1 0.000026
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.4( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.198931 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.4( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.363738 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.4( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.205713 1 0.000022
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.4( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.206487 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.4( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.372155 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.6( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.213140 1 0.000031
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.6( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.213938 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.6( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.379556 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.9( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.220227 1 0.000022
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.9( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.221075 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.9( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.386619 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.f( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.227985 1 0.000026
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.f( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.228828 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.f( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.396056 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.c( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.235258 1 0.000027
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.c( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.236162 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.c( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.398914 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1e( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.242493 1 0.000026
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1e( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.243422 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1e( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.408111 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.d( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.249597 1 0.000044
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.d( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.250579 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.d( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.416829 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.2( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.256923 1 0.000029
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.2( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.257944 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.2( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.423935 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1d( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.264239 1 0.000027
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1d( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.265556 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.1d( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.429591 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.d( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.271797 1 0.000029
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.d( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.273010 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[6.d( empty lb MIN local-lis/les=43/44 n=0 ec=43/29 lis/c=43/43 les/c/f=44/44/0 sis=46) [1] r=-1 lpr=46 pi=[43,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.439352 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.8( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 DELETING pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.279074 1 0.000027
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.8( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.279320 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 pg_epoch: 47 pg[4.8( empty lb MIN local-lis/les=41/42 n=0 ec=41/25 lis/c=41/41 les/c/f=42/42/0 sis=46) [1] r=-1 lpr=46 pi=[41,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 1.445226 0 0.000000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:14:58.794578+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 57720832 unmapped: 1990656 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.b scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.b scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:14:59.794754+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 13 sent 11 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:15:29.639057+0000 osd.0 (osd.0) 12 : cluster [DBG] 4.b scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:15:29.653177+0000 osd.0 (osd.0) 13 : cluster [DBG] 4.b scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 57745408 unmapped: 1966080 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 337551 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 13) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:15:29.639057+0000 osd.0 (osd.0) 12 : cluster [DBG] 4.b scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:15:29.653177+0000 osd.0 (osd.0) 13 : cluster [DBG] 4.b scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:00.794960+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 57802752 unmapped: 1908736 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.c scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.c scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:01.795101+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 15 sent 13 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:15:31.590746+0000 osd.0 (osd.0) 14 : cluster [DBG] 4.c scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:15:31.604847+0000 osd.0 (osd.0) 15 : cluster [DBG] 4.c scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 57843712 unmapped: 1867776 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:02.795373+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 15) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:15:31.590746+0000 osd.0 (osd.0) 14 : cluster [DBG] 4.c scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:15:31.604847+0000 osd.0 (osd.0) 15 : cluster [DBG] 4.c scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 57876480 unmapped: 1835008 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:03.795529+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 57876480 unmapped: 1835008 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:04.795695+0000)
Dec 01 09:39:37 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config dump"} v 0) v1
Dec 01 09:39:37 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2980152995' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 57892864 unmapped: 1818624 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 346418 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:05.795882+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 57933824 unmapped: 1777664 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:06.796129+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 57933824 unmapped: 1777664 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.15 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.322055817s of 13.008253098s, submitted: 385
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.15 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:07.796284+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 17 sent 15 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:15:37.552272+0000 osd.0 (osd.0) 16 : cluster [DBG] 4.15 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:15:37.566334+0000 osd.0 (osd.0) 17 : cluster [DBG] 4.15 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 17) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:15:37.552272+0000 osd.0 (osd.0) 16 : cluster [DBG] 4.15 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:15:37.566334+0000 osd.0 (osd.0) 17 : cluster [DBG] 4.15 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 57958400 unmapped: 1753088 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:08.796556+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.16 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 57966592 unmapped: 1744896 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.16 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:09.796721+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 19 sent 17 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:15:39.543425+0000 osd.0 (osd.0) 18 : cluster [DBG] 4.16 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:15:39.557700+0000 osd.0 (osd.0) 19 : cluster [DBG] 4.16 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 19) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:15:39.543425+0000 osd.0 (osd.0) 18 : cluster [DBG] 4.16 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:15:39.557700+0000 osd.0 (osd.0) 19 : cluster [DBG] 4.16 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 57942016 unmapped: 1769472 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 348714 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:10.796951+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 57950208 unmapped: 1761280 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:11.797113+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 57950208 unmapped: 1761280 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.17 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.17 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:12.797285+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 21 sent 19 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:15:42.592918+0000 osd.0 (osd.0) 20 : cluster [DBG] 4.17 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:15:42.606999+0000 osd.0 (osd.0) 21 : cluster [DBG] 4.17 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 21) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:15:42.592918+0000 osd.0 (osd.0) 20 : cluster [DBG] 4.17 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:15:42.606999+0000 osd.0 (osd.0) 21 : cluster [DBG] 4.17 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 57950208 unmapped: 1761280 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:13.797524+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 57966592 unmapped: 1744896 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:14.797713+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 57999360 unmapped: 1712128 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.19 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.19 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 351010 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:15.797870+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 23 sent 21 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:15:45.625090+0000 osd.0 (osd.0) 22 : cluster [DBG] 4.19 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:15:45.638610+0000 osd.0 (osd.0) 23 : cluster [DBG] 4.19 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 23) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:15:45.625090+0000 osd.0 (osd.0) 22 : cluster [DBG] 4.19 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:15:45.638610+0000 osd.0 (osd.0) 23 : cluster [DBG] 4.19 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58007552 unmapped: 1703936 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.1d scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.1d scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:16.798113+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 25 sent 23 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:15:46.609836+0000 osd.0 (osd.0) 24 : cluster [DBG] 4.1d scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:15:46.623778+0000 osd.0 (osd.0) 25 : cluster [DBG] 4.1d scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 25) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:15:46.609836+0000 osd.0 (osd.0) 24 : cluster [DBG] 4.1d scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:15:46.623778+0000 osd.0 (osd.0) 25 : cluster [DBG] 4.1d scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58007552 unmapped: 1703936 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:17.798380+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58015744 unmapped: 1695744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:18.798545+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58015744 unmapped: 1695744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:19.798757+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58048512 unmapped: 1662976 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 352158 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:20.798922+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58048512 unmapped: 1662976 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:21.799117+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58048512 unmapped: 1662976 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.1e scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.018917084s of 15.051235199s, submitted: 10
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.1e scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:22.799329+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 27 sent 25 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:15:52.603678+0000 osd.0 (osd.0) 26 : cluster [DBG] 4.1e scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:15:52.617707+0000 osd.0 (osd.0) 27 : cluster [DBG] 4.1e scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 27) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:15:52.603678+0000 osd.0 (osd.0) 26 : cluster [DBG] 4.1e scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:15:52.617707+0000 osd.0 (osd.0) 27 : cluster [DBG] 4.1e scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58056704 unmapped: 1654784 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:23.799599+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.1f scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 4.1f scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58073088 unmapped: 1638400 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:24.799767+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 29 sent 27 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:15:54.572790+0000 osd.0 (osd.0) 28 : cluster [DBG] 4.1f scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:15:54.586717+0000 osd.0 (osd.0) 29 : cluster [DBG] 4.1f scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 29) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:15:54.572790+0000 osd.0 (osd.0) 28 : cluster [DBG] 4.1f scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:15:54.586717+0000 osd.0 (osd.0) 29 : cluster [DBG] 4.1f scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58097664 unmapped: 1613824 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.3 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.3 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 355601 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:25.800010+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 31 sent 29 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:15:55.620475+0000 osd.0 (osd.0) 30 : cluster [DBG] 6.3 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:15:55.634343+0000 osd.0 (osd.0) 31 : cluster [DBG] 6.3 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 31) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:15:55.620475+0000 osd.0 (osd.0) 30 : cluster [DBG] 6.3 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:15:55.634343+0000 osd.0 (osd.0) 31 : cluster [DBG] 6.3 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.5 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58114048 unmapped: 1597440 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.5 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:26.800377+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 33 sent 31 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:15:56.610938+0000 osd.0 (osd.0) 32 : cluster [DBG] 6.5 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:15:56.624774+0000 osd.0 (osd.0) 33 : cluster [DBG] 6.5 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 33) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:15:56.610938+0000 osd.0 (osd.0) 32 : cluster [DBG] 6.5 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:15:56.624774+0000 osd.0 (osd.0) 33 : cluster [DBG] 6.5 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.7 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.7 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58114048 unmapped: 1597440 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:27.800608+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 35 sent 33 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:15:57.589495+0000 osd.0 (osd.0) 34 : cluster [DBG] 6.7 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:15:57.607110+0000 osd.0 (osd.0) 35 : cluster [DBG] 6.7 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 35) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:15:57.589495+0000 osd.0 (osd.0) 34 : cluster [DBG] 6.7 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:15:57.607110+0000 osd.0 (osd.0) 35 : cluster [DBG] 6.7 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58122240 unmapped: 1589248 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:28.800875+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58122240 unmapped: 1589248 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:29.801126+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58146816 unmapped: 1564672 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 357895 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:30.801334+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.9 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.9 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58146816 unmapped: 1564672 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:31.801521+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 37 sent 35 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:01.537470+0000 osd.0 (osd.0) 36 : cluster [DBG] 6.9 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:01.551511+0000 osd.0 (osd.0) 37 : cluster [DBG] 6.9 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 37) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:01.537470+0000 osd.0 (osd.0) 36 : cluster [DBG] 6.9 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:01.551511+0000 osd.0 (osd.0) 37 : cluster [DBG] 6.9 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58155008 unmapped: 1556480 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:32.801808+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58155008 unmapped: 1556480 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:33.802011+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.a scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.884596825s of 11.934671402s, submitted: 12
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.a scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58155008 unmapped: 1556480 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:34.802180+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 39 sent 37 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:04.538569+0000 osd.0 (osd.0) 38 : cluster [DBG] 6.a scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:04.552401+0000 osd.0 (osd.0) 39 : cluster [DBG] 6.a scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 39) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:04.538569+0000 osd.0 (osd.0) 38 : cluster [DBG] 6.a scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:04.552401+0000 osd.0 (osd.0) 39 : cluster [DBG] 6.a scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.10 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.10 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58171392 unmapped: 1540096 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 361337 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:35.802409+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 41 sent 39 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:05.566654+0000 osd.0 (osd.0) 40 : cluster [DBG] 6.10 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:05.594896+0000 osd.0 (osd.0) 41 : cluster [DBG] 6.10 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 41) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:05.566654+0000 osd.0 (osd.0) 40 : cluster [DBG] 6.10 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:05.594896+0000 osd.0 (osd.0) 41 : cluster [DBG] 6.10 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.12 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.12 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58171392 unmapped: 1540096 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:36.802669+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 43 sent 41 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:06.609155+0000 osd.0 (osd.0) 42 : cluster [DBG] 6.12 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:06.623325+0000 osd.0 (osd.0) 43 : cluster [DBG] 6.12 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 43) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:06.609155+0000 osd.0 (osd.0) 42 : cluster [DBG] 6.12 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:06.623325+0000 osd.0 (osd.0) 43 : cluster [DBG] 6.12 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58179584 unmapped: 1531904 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:37.802921+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58171392 unmapped: 1540096 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:38.803188+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58179584 unmapped: 1531904 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:39.803379+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58179584 unmapped: 1531904 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:40.803608+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 362485 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58187776 unmapped: 1523712 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:41.803764+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.16 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.16 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58204160 unmapped: 1507328 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:42.803943+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 45 sent 43 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:12.630925+0000 osd.0 (osd.0) 44 : cluster [DBG] 6.16 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:12.644971+0000 osd.0 (osd.0) 45 : cluster [DBG] 6.16 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 45) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:12.630925+0000 osd.0 (osd.0) 44 : cluster [DBG] 6.16 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:12.644971+0000 osd.0 (osd.0) 45 : cluster [DBG] 6.16 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.18 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.18 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58212352 unmapped: 1499136 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:43.804190+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 47 sent 45 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:13.614366+0000 osd.0 (osd.0) 46 : cluster [DBG] 6.18 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:13.628336+0000 osd.0 (osd.0) 47 : cluster [DBG] 6.18 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 47) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:13.614366+0000 osd.0 (osd.0) 46 : cluster [DBG] 6.18 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:13.628336+0000 osd.0 (osd.0) 47 : cluster [DBG] 6.18 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58220544 unmapped: 1490944 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:44.804416+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.19 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.524544716s of 11.071758270s, submitted: 10
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.19 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58228736 unmapped: 1482752 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:45.804585+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 49 sent 47 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:15.610175+0000 osd.0 (osd.0) 48 : cluster [DBG] 6.19 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:15.624335+0000 osd.0 (osd.0) 49 : cluster [DBG] 6.19 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 365929 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 49) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:15.610175+0000 osd.0 (osd.0) 48 : cluster [DBG] 6.19 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:15.624335+0000 osd.0 (osd.0) 49 : cluster [DBG] 6.19 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58228736 unmapped: 1482752 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:46.804805+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.1a deep-scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.1a deep-scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58228736 unmapped: 1482752 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:47.805005+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 51 sent 49 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:17.580791+0000 osd.0 (osd.0) 50 : cluster [DBG] 6.1a deep-scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:17.594908+0000 osd.0 (osd.0) 51 : cluster [DBG] 6.1a deep-scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 51) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:17.580791+0000 osd.0 (osd.0) 50 : cluster [DBG] 6.1a deep-scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:17.594908+0000 osd.0 (osd.0) 51 : cluster [DBG] 6.1a deep-scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58228736 unmapped: 1482752 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:48.805243+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58236928 unmapped: 1474560 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:49.805396+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.1b scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.1b scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58245120 unmapped: 1466368 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:50.805521+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 53 sent 51 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:20.546636+0000 osd.0 (osd.0) 52 : cluster [DBG] 6.1b scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:20.560699+0000 osd.0 (osd.0) 53 : cluster [DBG] 6.1b scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 368225 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.1e scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.1e scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 53) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:20.546636+0000 osd.0 (osd.0) 52 : cluster [DBG] 6.1b scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:20.560699+0000 osd.0 (osd.0) 53 : cluster [DBG] 6.1b scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58245120 unmapped: 1466368 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:51.805765+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 55 sent 53 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:21.552360+0000 osd.0 (osd.0) 54 : cluster [DBG] 5.1e scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:21.566317+0000 osd.0 (osd.0) 55 : cluster [DBG] 5.1e scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.19 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.19 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 55) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:21.552360+0000 osd.0 (osd.0) 54 : cluster [DBG] 5.1e scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:21.566317+0000 osd.0 (osd.0) 55 : cluster [DBG] 5.1e scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58236928 unmapped: 1474560 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:52.806005+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 57 sent 55 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:22.548757+0000 osd.0 (osd.0) 56 : cluster [DBG] 2.19 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:22.562790+0000 osd.0 (osd.0) 57 : cluster [DBG] 2.19 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 57) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:22.548757+0000 osd.0 (osd.0) 56 : cluster [DBG] 2.19 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:22.562790+0000 osd.0 (osd.0) 57 : cluster [DBG] 2.19 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58236928 unmapped: 1474560 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:53.806249+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58245120 unmapped: 1466368 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:54.806550+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58245120 unmapped: 1466368 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:55.806746+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 370521 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.18 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.624811172s of 10.919664383s, submitted: 10
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.18 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58261504 unmapped: 1449984 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:56.806923+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 59 sent 57 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:26.529908+0000 osd.0 (osd.0) 58 : cluster [DBG] 2.18 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:26.544023+0000 osd.0 (osd.0) 59 : cluster [DBG] 2.18 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.16 deep-scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.16 deep-scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:57.807207+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 4 last_log 61 sent 59 num 4 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:27.530553+0000 osd.0 (osd.0) 60 : cluster [DBG] 2.16 deep-scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:27.544552+0000 osd.0 (osd.0) 61 : cluster [DBG] 2.16 deep-scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58294272 unmapped: 1417216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 59) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:26.529908+0000 osd.0 (osd.0) 58 : cluster [DBG] 2.18 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:26.544023+0000 osd.0 (osd.0) 59 : cluster [DBG] 2.18 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.13 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.13 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:58.807417+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 4 last_log 63 sent 61 num 4 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:28.509925+0000 osd.0 (osd.0) 62 : cluster [DBG] 2.13 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:28.524060+0000 osd.0 (osd.0) 63 : cluster [DBG] 2.13 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58302464 unmapped: 1409024 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 61) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:27.530553+0000 osd.0 (osd.0) 60 : cluster [DBG] 2.16 deep-scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:27.544552+0000 osd.0 (osd.0) 61 : cluster [DBG] 2.16 deep-scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 63) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:28.509925+0000 osd.0 (osd.0) 62 : cluster [DBG] 2.13 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:28.524060+0000 osd.0 (osd.0) 63 : cluster [DBG] 2.13 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:59.807719+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58310656 unmapped: 1400832 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:00.807889+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 373965 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58310656 unmapped: 1400832 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:01.808093+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58318848 unmapped: 1392640 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.14 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.14 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:02.808273+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 65 sent 63 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:32.470612+0000 osd.0 (osd.0) 64 : cluster [DBG] 5.14 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:32.484512+0000 osd.0 (osd.0) 65 : cluster [DBG] 5.14 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.15 deep-scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58318848 unmapped: 1392640 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.15 deep-scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 65) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:32.470612+0000 osd.0 (osd.0) 64 : cluster [DBG] 5.14 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:32.484512+0000 osd.0 (osd.0) 65 : cluster [DBG] 5.14 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:03.808549+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 67 sent 65 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:33.464782+0000 osd.0 (osd.0) 66 : cluster [DBG] 5.15 deep-scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:33.478737+0000 osd.0 (osd.0) 67 : cluster [DBG] 5.15 deep-scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58327040 unmapped: 1384448 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 67) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:33.464782+0000 osd.0 (osd.0) 66 : cluster [DBG] 5.15 deep-scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:33.478737+0000 osd.0 (osd.0) 67 : cluster [DBG] 5.15 deep-scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:04.808767+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.f scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.f scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58368000 unmapped: 1343488 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:05.808920+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 69 sent 67 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:35.379257+0000 osd.0 (osd.0) 68 : cluster [DBG] 2.f scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:35.393099+0000 osd.0 (osd.0) 69 : cluster [DBG] 2.f scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 377408 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58368000 unmapped: 1343488 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 69) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:35.379257+0000 osd.0 (osd.0) 68 : cluster [DBG] 2.f scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:35.393099+0000 osd.0 (osd.0) 69 : cluster [DBG] 2.f scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:06.809150+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58376192 unmapped: 1335296 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:07.809323+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.7 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.823574066s of 11.904456139s, submitted: 12
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.7 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58376192 unmapped: 1335296 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:08.809476+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 71 sent 69 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:38.434349+0000 osd.0 (osd.0) 70 : cluster [DBG] 5.7 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:38.448558+0000 osd.0 (osd.0) 71 : cluster [DBG] 5.7 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.4 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.4 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58384384 unmapped: 1327104 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 71) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:38.434349+0000 osd.0 (osd.0) 70 : cluster [DBG] 5.7 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:38.448558+0000 osd.0 (osd.0) 71 : cluster [DBG] 5.7 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:09.809734+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 73 sent 71 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:39.408865+0000 osd.0 (osd.0) 72 : cluster [DBG] 5.4 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:39.422853+0000 osd.0 (osd.0) 73 : cluster [DBG] 5.4 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58392576 unmapped: 1318912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 73) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:39.408865+0000 osd.0 (osd.0) 72 : cluster [DBG] 5.4 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:39.422853+0000 osd.0 (osd.0) 73 : cluster [DBG] 5.4 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:10.809978+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 379702 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.11 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.11 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58400768 unmapped: 1310720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:11.810164+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 75 sent 73 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:41.455252+0000 osd.0 (osd.0) 74 : cluster [DBG] 2.11 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:41.469578+0000 osd.0 (osd.0) 75 : cluster [DBG] 2.11 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58400768 unmapped: 1310720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 75) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:41.455252+0000 osd.0 (osd.0) 74 : cluster [DBG] 2.11 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:41.469578+0000 osd.0 (osd.0) 75 : cluster [DBG] 2.11 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:12.810391+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.3 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.3 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58408960 unmapped: 1302528 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:13.810581+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 77 sent 75 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:43.390779+0000 osd.0 (osd.0) 76 : cluster [DBG] 5.3 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:43.404338+0000 osd.0 (osd.0) 77 : cluster [DBG] 5.3 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58408960 unmapped: 1302528 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 77) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:43.390779+0000 osd.0 (osd.0) 76 : cluster [DBG] 5.3 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:43.404338+0000 osd.0 (osd.0) 77 : cluster [DBG] 5.3 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:14.811112+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.8 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.8 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58425344 unmapped: 1286144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:15.811307+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 79 sent 77 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:45.367464+0000 osd.0 (osd.0) 78 : cluster [DBG] 2.8 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:45.381502+0000 osd.0 (osd.0) 79 : cluster [DBG] 2.8 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 383144 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58425344 unmapped: 1286144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 79) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:45.367464+0000 osd.0 (osd.0) 78 : cluster [DBG] 2.8 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:45.381502+0000 osd.0 (osd.0) 79 : cluster [DBG] 2.8 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:16.811530+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58433536 unmapped: 1277952 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:17.811702+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58433536 unmapped: 1277952 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:18.811864+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.5 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.929822922s of 10.976409912s, submitted: 10
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.5 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58441728 unmapped: 1269760 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:19.812024+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 81 sent 79 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:49.410835+0000 osd.0 (osd.0) 80 : cluster [DBG] 5.5 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:49.424838+0000 osd.0 (osd.0) 81 : cluster [DBG] 5.5 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 81) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:49.410835+0000 osd.0 (osd.0) 80 : cluster [DBG] 5.5 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:49.424838+0000 osd.0 (osd.0) 81 : cluster [DBG] 5.5 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58449920 unmapped: 1261568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:20.812331+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 384291 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.b scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.b scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58466304 unmapped: 1245184 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:21.812586+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 83 sent 81 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:51.399596+0000 osd.0 (osd.0) 82 : cluster [DBG] 2.b scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:51.413727+0000 osd.0 (osd.0) 83 : cluster [DBG] 2.b scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 83) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:51.399596+0000 osd.0 (osd.0) 82 : cluster [DBG] 2.b scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:51.413727+0000 osd.0 (osd.0) 83 : cluster [DBG] 2.b scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58466304 unmapped: 1245184 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:22.812793+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58474496 unmapped: 1236992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:23.812962+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.2 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.2 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58474496 unmapped: 1236992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:24.813147+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 85 sent 83 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:54.452981+0000 osd.0 (osd.0) 84 : cluster [DBG] 5.2 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:54.467182+0000 osd.0 (osd.0) 85 : cluster [DBG] 5.2 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 85) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:54.452981+0000 osd.0 (osd.0) 84 : cluster [DBG] 5.2 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:54.467182+0000 osd.0 (osd.0) 85 : cluster [DBG] 5.2 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58507264 unmapped: 1204224 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:25.813448+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 386585 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58507264 unmapped: 1204224 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:26.813624+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.1d scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.1d scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58515456 unmapped: 1196032 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:27.813808+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 87 sent 85 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:57.399427+0000 osd.0 (osd.0) 86 : cluster [DBG] 2.1d scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:57.413428+0000 osd.0 (osd.0) 87 : cluster [DBG] 2.1d scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 87) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:57.399427+0000 osd.0 (osd.0) 86 : cluster [DBG] 2.1d scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:57.413428+0000 osd.0 (osd.0) 87 : cluster [DBG] 2.1d scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58515456 unmapped: 1196032 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:28.814014+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.1f scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.1f scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58523648 unmapped: 1187840 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:29.814160+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 89 sent 87 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:59.374774+0000 osd.0 (osd.0) 88 : cluster [DBG] 2.1f scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:59.388931+0000 osd.0 (osd.0) 89 : cluster [DBG] 2.1f scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58540032 unmapped: 1171456 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 89) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:59.374774+0000 osd.0 (osd.0) 88 : cluster [DBG] 2.1f scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:59.388931+0000 osd.0 (osd.0) 89 : cluster [DBG] 2.1f scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:30.814481+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 388881 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58540032 unmapped: 1171456 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:31.814665+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.1c scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.916635513s of 12.950966835s, submitted: 10
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.1c scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58548224 unmapped: 1163264 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:32.814881+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 91 sent 89 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:02.361938+0000 osd.0 (osd.0) 90 : cluster [DBG] 2.1c scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:02.375966+0000 osd.0 (osd.0) 91 : cluster [DBG] 2.1c scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58548224 unmapped: 1163264 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 91) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:02.361938+0000 osd.0 (osd.0) 90 : cluster [DBG] 2.1c scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:02.375966+0000 osd.0 (osd.0) 91 : cluster [DBG] 2.1c scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:33.815142+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58556416 unmapped: 1155072 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:34.815368+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58556416 unmapped: 1155072 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:35.815538+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 390029 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58556416 unmapped: 1155072 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:36.815695+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.17 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.17 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58580992 unmapped: 1130496 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:37.815857+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 93 sent 91 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:07.374384+0000 osd.0 (osd.0) 92 : cluster [DBG] 3.17 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:07.388451+0000 osd.0 (osd.0) 93 : cluster [DBG] 3.17 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 93) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:07.374384+0000 osd.0 (osd.0) 92 : cluster [DBG] 3.17 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:07.388451+0000 osd.0 (osd.0) 93 : cluster [DBG] 3.17 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.13 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.13 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58589184 unmapped: 1122304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:38.816085+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 95 sent 93 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:08.349892+0000 osd.0 (osd.0) 94 : cluster [DBG] 7.13 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:08.363839+0000 osd.0 (osd.0) 95 : cluster [DBG] 7.13 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 95) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:08.349892+0000 osd.0 (osd.0) 94 : cluster [DBG] 7.13 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:08.363839+0000 osd.0 (osd.0) 95 : cluster [DBG] 7.13 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58589184 unmapped: 1122304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:39.816356+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58597376 unmapped: 1114112 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:40.816535+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 392325 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58597376 unmapped: 1114112 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:41.816716+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.15 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.15 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58605568 unmapped: 1105920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:42.816875+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 97 sent 95 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:12.305964+0000 osd.0 (osd.0) 96 : cluster [DBG] 3.15 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:12.319947+0000 osd.0 (osd.0) 97 : cluster [DBG] 3.15 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 97) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:12.305964+0000 osd.0 (osd.0) 96 : cluster [DBG] 3.15 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:12.319947+0000 osd.0 (osd.0) 97 : cluster [DBG] 3.15 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58613760 unmapped: 1097728 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:43.817149+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58613760 unmapped: 1097728 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:44.817385+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58638336 unmapped: 1073152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:45.817539+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 393473 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58638336 unmapped: 1073152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:46.817689+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58646528 unmapped: 1064960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:47.817854+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.12 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.937739372s of 15.965059280s, submitted: 8
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.12 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58646528 unmapped: 1064960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:48.818019+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 99 sent 97 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:18.326960+0000 osd.0 (osd.0) 98 : cluster [DBG] 3.12 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:18.341083+0000 osd.0 (osd.0) 99 : cluster [DBG] 3.12 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 99) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:18.326960+0000 osd.0 (osd.0) 98 : cluster [DBG] 3.12 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:18.341083+0000 osd.0 (osd.0) 99 : cluster [DBG] 3.12 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.f scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.f scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58662912 unmapped: 1048576 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:49.818242+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 101 sent 99 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:19.336348+0000 osd.0 (osd.0) 100 : cluster [DBG] 3.f scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:19.350308+0000 osd.0 (osd.0) 101 : cluster [DBG] 3.f scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 101) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:19.336348+0000 osd.0 (osd.0) 100 : cluster [DBG] 3.f scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:19.350308+0000 osd.0 (osd.0) 101 : cluster [DBG] 3.f scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58662912 unmapped: 1048576 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:50.818679+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 395768 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58671104 unmapped: 1040384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:51.818896+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58671104 unmapped: 1040384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:52.819054+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.9 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.9 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58671104 unmapped: 1040384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:53.819237+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 103 sent 101 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:23.269210+0000 osd.0 (osd.0) 102 : cluster [DBG] 7.9 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:23.283279+0000 osd.0 (osd.0) 103 : cluster [DBG] 7.9 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 103) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:23.269210+0000 osd.0 (osd.0) 102 : cluster [DBG] 7.9 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:23.283279+0000 osd.0 (osd.0) 103 : cluster [DBG] 7.9 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.c deep-scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.c deep-scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58703872 unmapped: 1007616 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:54.819832+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 105 sent 103 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:24.296815+0000 osd.0 (osd.0) 104 : cluster [DBG] 3.c deep-scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:24.310896+0000 osd.0 (osd.0) 105 : cluster [DBG] 3.c deep-scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 105) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:24.296815+0000 osd.0 (osd.0) 104 : cluster [DBG] 3.c deep-scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:24.310896+0000 osd.0 (osd.0) 105 : cluster [DBG] 3.c deep-scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58703872 unmapped: 1007616 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:55.820150+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 398062 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58720256 unmapped: 991232 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:56.820311+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.f scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.f scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58720256 unmapped: 991232 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:57.820462+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 107 sent 105 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:27.332240+0000 osd.0 (osd.0) 106 : cluster [DBG] 7.f scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:27.346334+0000 osd.0 (osd.0) 107 : cluster [DBG] 7.f scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 107) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:27.332240+0000 osd.0 (osd.0) 106 : cluster [DBG] 7.f scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:27.346334+0000 osd.0 (osd.0) 107 : cluster [DBG] 7.f scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58720256 unmapped: 991232 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:58.820687+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.6 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.009760857s of 11.044724464s, submitted: 10
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.6 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58736640 unmapped: 974848 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:59.820871+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 109 sent 107 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:29.371822+0000 osd.0 (osd.0) 108 : cluster [DBG] 7.6 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:29.385727+0000 osd.0 (osd.0) 109 : cluster [DBG] 7.6 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 109) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:29.371822+0000 osd.0 (osd.0) 108 : cluster [DBG] 7.6 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:29.385727+0000 osd.0 (osd.0) 109 : cluster [DBG] 7.6 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58736640 unmapped: 974848 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:00.821181+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 400356 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58744832 unmapped: 966656 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:01.821359+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.6 deep-scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.6 deep-scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58761216 unmapped: 950272 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:02.821507+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 111 sent 109 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:32.408675+0000 osd.0 (osd.0) 110 : cluster [DBG] 3.6 deep-scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:32.422674+0000 osd.0 (osd.0) 111 : cluster [DBG] 3.6 deep-scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 111) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:32.408675+0000 osd.0 (osd.0) 110 : cluster [DBG] 3.6 deep-scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:32.422674+0000 osd.0 (osd.0) 111 : cluster [DBG] 3.6 deep-scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.3 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.3 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58761216 unmapped: 950272 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:03.821723+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 113 sent 111 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:33.415355+0000 osd.0 (osd.0) 112 : cluster [DBG] 7.3 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:33.429358+0000 osd.0 (osd.0) 113 : cluster [DBG] 7.3 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 113) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:33.415355+0000 osd.0 (osd.0) 112 : cluster [DBG] 7.3 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:33.429358+0000 osd.0 (osd.0) 113 : cluster [DBG] 7.3 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.9 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.9 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58802176 unmapped: 909312 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:04.822116+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 115 sent 113 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:34.387045+0000 osd.0 (osd.0) 114 : cluster [DBG] 3.9 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:34.401217+0000 osd.0 (osd.0) 115 : cluster [DBG] 3.9 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 115) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:34.387045+0000 osd.0 (osd.0) 114 : cluster [DBG] 3.9 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:34.401217+0000 osd.0 (osd.0) 115 : cluster [DBG] 3.9 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.a scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.a scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58818560 unmapped: 892928 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:05.822362+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 117 sent 115 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:35.414919+0000 osd.0 (osd.0) 116 : cluster [DBG] 3.a scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:35.428947+0000 osd.0 (osd.0) 117 : cluster [DBG] 3.a scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 117) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:35.414919+0000 osd.0 (osd.0) 116 : cluster [DBG] 3.a scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:35.428947+0000 osd.0 (osd.0) 117 : cluster [DBG] 3.a scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 404944 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58826752 unmapped: 884736 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:06.822623+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.4 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.4 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58826752 unmapped: 884736 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:07.822823+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 119 sent 117 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:37.457522+0000 osd.0 (osd.0) 118 : cluster [DBG] 7.4 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:37.472254+0000 osd.0 (osd.0) 119 : cluster [DBG] 7.4 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 119) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:37.457522+0000 osd.0 (osd.0) 118 : cluster [DBG] 7.4 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:37.472254+0000 osd.0 (osd.0) 119 : cluster [DBG] 7.4 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58826752 unmapped: 884736 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:08.823354+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58843136 unmapped: 868352 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:09.823571+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58843136 unmapped: 868352 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:10.823772+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 406091 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58851328 unmapped: 860160 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:11.823935+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58851328 unmapped: 860160 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:12.824099+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58851328 unmapped: 860160 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:13.824283+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.1b scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.002301216s of 15.046391487s, submitted: 12
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.1b scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58859520 unmapped: 851968 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:14.824517+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 121 sent 119 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:44.418125+0000 osd.0 (osd.0) 120 : cluster [DBG] 3.1b scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:44.432323+0000 osd.0 (osd.0) 121 : cluster [DBG] 3.1b scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 121) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:44.418125+0000 osd.0 (osd.0) 120 : cluster [DBG] 3.1b scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:44.432323+0000 osd.0 (osd.0) 121 : cluster [DBG] 3.1b scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.18 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.18 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58859520 unmapped: 851968 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:15.824775+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 123 sent 121 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:45.428356+0000 osd.0 (osd.0) 122 : cluster [DBG] 7.18 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:45.442452+0000 osd.0 (osd.0) 123 : cluster [DBG] 7.18 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 408387 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 123) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:45.428356+0000 osd.0 (osd.0) 122 : cluster [DBG] 7.18 scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:45.442452+0000 osd.0 (osd.0) 123 : cluster [DBG] 7.18 scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.1f scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.1f scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58867712 unmapped: 843776 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:16.825143+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 125 sent 123 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:46.403953+0000 osd.0 (osd.0) 124 : cluster [DBG] 7.1f scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:46.418078+0000 osd.0 (osd.0) 125 : cluster [DBG] 7.1f scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 125) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:46.403953+0000 osd.0 (osd.0) 124 : cluster [DBG] 7.1f scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:46.418078+0000 osd.0 (osd.0) 125 : cluster [DBG] 7.1f scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.1f scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.1f scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58884096 unmapped: 827392 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:17.825531+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 127 sent 125 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:47.388794+0000 osd.0 (osd.0) 126 : cluster [DBG] 3.1f scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:47.402875+0000 osd.0 (osd.0) 127 : cluster [DBG] 3.1f scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 127) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:47.388794+0000 osd.0 (osd.0) 126 : cluster [DBG] 3.1f scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:47.402875+0000 osd.0 (osd.0) 127 : cluster [DBG] 3.1f scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58892288 unmapped: 819200 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:18.825848+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.1b scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.1b scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58908672 unmapped: 802816 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:19.825989+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 129 sent 127 num 2 unsent 2 sending 2
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:49.384350+0000 osd.0 (osd.0) 128 : cluster [DBG] 7.1b scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:49.398826+0000 osd.0 (osd.0) 129 : cluster [DBG] 7.1b scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 129) v1
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:49.384350+0000 osd.0 (osd.0) 128 : cluster [DBG] 7.1b scrub starts
Dec 01 09:39:37 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:49.398826+0000 osd.0 (osd.0) 129 : cluster [DBG] 7.1b scrub ok
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58916864 unmapped: 794624 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:20.826188+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58925056 unmapped: 786432 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:21.826335+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58925056 unmapped: 786432 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:22.826476+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58933248 unmapped: 778240 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:23.826596+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58933248 unmapped: 778240 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:24.826781+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58957824 unmapped: 753664 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:25.827085+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58966016 unmapped: 745472 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:26.827539+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58966016 unmapped: 745472 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:27.827759+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58974208 unmapped: 737280 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:28.828149+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58974208 unmapped: 737280 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:29.828310+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58974208 unmapped: 737280 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:30.828452+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58982400 unmapped: 729088 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:31.828654+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58982400 unmapped: 729088 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:32.829193+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58990592 unmapped: 720896 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:33.829771+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58998784 unmapped: 712704 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:34.830129+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58998784 unmapped: 712704 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:35.830466+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59006976 unmapped: 704512 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:36.831076+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59015168 unmapped: 696320 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:37.831505+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59023360 unmapped: 688128 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:38.831871+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59023360 unmapped: 688128 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:39.832089+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59031552 unmapped: 679936 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:40.832519+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:41.832789+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59031552 unmapped: 679936 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:42.833208+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59031552 unmapped: 679936 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:43.833439+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59047936 unmapped: 663552 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:44.833729+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59047936 unmapped: 663552 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:45.834473+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59064320 unmapped: 647168 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:46.834946+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59072512 unmapped: 638976 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:47.835371+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59072512 unmapped: 638976 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:48.835884+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59080704 unmapped: 630784 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:49.836348+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59080704 unmapped: 630784 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:50.836558+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59088896 unmapped: 622592 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:51.836764+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59088896 unmapped: 622592 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:52.836936+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59097088 unmapped: 614400 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:53.837495+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59097088 unmapped: 614400 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:54.837805+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59097088 unmapped: 614400 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:55.837953+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59105280 unmapped: 606208 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:56.838251+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59105280 unmapped: 606208 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:57.838411+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59113472 unmapped: 598016 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:58.838750+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59113472 unmapped: 598016 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:59.839032+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59113472 unmapped: 598016 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:00.839275+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59129856 unmapped: 581632 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:01.839606+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59129856 unmapped: 581632 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:02.839835+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59138048 unmapped: 573440 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:03.840047+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59138048 unmapped: 573440 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:04.840266+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59146240 unmapped: 565248 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:05.840585+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59146240 unmapped: 565248 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:06.840741+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59154432 unmapped: 557056 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:07.841056+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59162624 unmapped: 548864 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:08.841405+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59162624 unmapped: 548864 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:09.841670+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59170816 unmapped: 540672 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:10.841929+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59170816 unmapped: 540672 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:11.842134+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59179008 unmapped: 532480 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:12.842273+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59179008 unmapped: 532480 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:13.842453+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59179008 unmapped: 532480 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:14.842712+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59187200 unmapped: 524288 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:15.842918+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59179008 unmapped: 532480 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:16.843069+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59179008 unmapped: 532480 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:17.843218+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59187200 unmapped: 524288 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:18.843445+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59187200 unmapped: 524288 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:19.843687+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59195392 unmapped: 516096 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:20.843851+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59195392 unmapped: 516096 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:21.844007+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59203584 unmapped: 507904 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:22.844178+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59203584 unmapped: 507904 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:23.844373+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59203584 unmapped: 507904 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:24.844554+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59219968 unmapped: 491520 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:25.844806+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59219968 unmapped: 491520 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:26.844985+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59228160 unmapped: 483328 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:27.845165+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59228160 unmapped: 483328 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:28.845416+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59228160 unmapped: 483328 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:29.845567+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59236352 unmapped: 475136 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:30.845720+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59236352 unmapped: 475136 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:31.845893+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59244544 unmapped: 466944 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:32.846071+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59244544 unmapped: 466944 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:33.846210+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59252736 unmapped: 458752 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:34.846450+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59252736 unmapped: 458752 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:35.846644+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59260928 unmapped: 450560 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:36.846994+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59277312 unmapped: 434176 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:37.847179+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59277312 unmapped: 434176 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:38.847408+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59285504 unmapped: 425984 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:39.847585+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59285504 unmapped: 425984 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:40.847934+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59301888 unmapped: 409600 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:41.848102+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59301888 unmapped: 409600 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:42.848348+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59301888 unmapped: 409600 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:43.848592+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59310080 unmapped: 401408 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:44.848802+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59310080 unmapped: 401408 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:45.848971+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59310080 unmapped: 401408 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:46.849166+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:47.849407+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:48.849710+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59326464 unmapped: 385024 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:49.849839+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59326464 unmapped: 385024 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:50.850006+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59334656 unmapped: 376832 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:51.850171+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59342848 unmapped: 368640 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:52.850335+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59342848 unmapped: 368640 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:53.850570+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59351040 unmapped: 360448 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:54.850796+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59351040 unmapped: 360448 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:55.851007+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59359232 unmapped: 352256 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:56.851240+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59359232 unmapped: 352256 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:57.851484+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59359232 unmapped: 352256 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:58.851658+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59367424 unmapped: 344064 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:59.851814+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59367424 unmapped: 344064 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:00.852007+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59367424 unmapped: 344064 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:01.852208+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59375616 unmapped: 335872 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:02.852415+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59375616 unmapped: 335872 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:03.852563+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59383808 unmapped: 327680 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:04.852770+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59392000 unmapped: 319488 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:05.852922+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59392000 unmapped: 319488 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:06.853136+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59400192 unmapped: 311296 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:07.853363+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59400192 unmapped: 311296 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:08.853615+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59408384 unmapped: 303104 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:09.853777+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59408384 unmapped: 303104 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:10.853992+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59416576 unmapped: 294912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:11.854176+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59416576 unmapped: 294912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:12.854361+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59416576 unmapped: 294912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:13.854561+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59424768 unmapped: 286720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:14.854771+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59424768 unmapped: 286720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:15.854906+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59424768 unmapped: 286720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:16.855059+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59432960 unmapped: 278528 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:17.855217+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59432960 unmapped: 278528 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:18.855388+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59441152 unmapped: 270336 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:19.855561+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59441152 unmapped: 270336 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:20.855728+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59441152 unmapped: 270336 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:21.855880+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59449344 unmapped: 262144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:22.856074+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59449344 unmapped: 262144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:23.856380+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59457536 unmapped: 253952 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:24.857126+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59457536 unmapped: 253952 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:25.857284+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59457536 unmapped: 253952 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:26.857473+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59465728 unmapped: 245760 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:27.857646+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59465728 unmapped: 245760 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:28.857814+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59465728 unmapped: 245760 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:29.857977+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59473920 unmapped: 237568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:30.858188+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59473920 unmapped: 237568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:31.858358+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59473920 unmapped: 237568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:32.858584+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59482112 unmapped: 229376 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:33.858754+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59482112 unmapped: 229376 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:34.858935+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59490304 unmapped: 221184 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:35.859149+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59490304 unmapped: 221184 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:36.859413+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59490304 unmapped: 221184 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:37.859568+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59498496 unmapped: 212992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:38.859702+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59498496 unmapped: 212992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:39.859899+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59506688 unmapped: 204800 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:40.860046+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59506688 unmapped: 204800 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:41.860193+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59506688 unmapped: 204800 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:42.860519+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59514880 unmapped: 196608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:43.860758+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59514880 unmapped: 196608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:44.860949+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59523072 unmapped: 188416 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:45.861107+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59523072 unmapped: 188416 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:46.861285+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59523072 unmapped: 188416 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:47.861428+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59531264 unmapped: 180224 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:48.861570+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59531264 unmapped: 180224 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:49.861719+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59531264 unmapped: 180224 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:50.861910+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59539456 unmapped: 172032 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:51.862136+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59539456 unmapped: 172032 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:52.862309+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59547648 unmapped: 163840 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:53.862451+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59547648 unmapped: 163840 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:54.862689+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59555840 unmapped: 155648 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:55.862878+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59555840 unmapped: 155648 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:56.863056+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59555840 unmapped: 155648 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:57.863245+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59564032 unmapped: 147456 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:58.863486+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59564032 unmapped: 147456 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:59.863702+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59572224 unmapped: 139264 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:00.863863+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59572224 unmapped: 139264 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:01.864008+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59572224 unmapped: 139264 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:02.864153+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59580416 unmapped: 131072 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:03.864407+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59580416 unmapped: 131072 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:04.864608+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59588608 unmapped: 122880 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:05.864769+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59588608 unmapped: 122880 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:06.864932+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59588608 unmapped: 122880 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:07.865102+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59596800 unmapped: 114688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:08.865408+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59596800 unmapped: 114688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:09.865605+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59588608 unmapped: 122880 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:10.865840+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59596800 unmapped: 114688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:11.866058+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59596800 unmapped: 114688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:12.866361+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59596800 unmapped: 114688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:13.866525+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59604992 unmapped: 106496 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:14.866719+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59604992 unmapped: 106496 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:15.866892+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59613184 unmapped: 98304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:16.867070+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59613184 unmapped: 98304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:17.867361+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59613184 unmapped: 98304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:18.867532+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59621376 unmapped: 90112 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:19.867698+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59621376 unmapped: 90112 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:20.867890+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59629568 unmapped: 81920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:21.868058+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59629568 unmapped: 81920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:22.868233+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59629568 unmapped: 81920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:23.868541+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59637760 unmapped: 73728 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:24.868844+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59645952 unmapped: 65536 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:25.869136+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59645952 unmapped: 65536 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:26.869392+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:27.869683+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:28.869906+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:29.870090+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:30.870367+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:31.870645+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:32.870838+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:33.870991+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:34.871168+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59678720 unmapped: 32768 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:35.871326+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59678720 unmapped: 32768 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:36.871599+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59686912 unmapped: 24576 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:37.871847+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59686912 unmapped: 24576 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:38.872094+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 16384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:39.872366+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 16384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:40.872605+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 16384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:41.872813+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 8192 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:42.872984+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 8192 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:43.873226+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 0 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:44.873556+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 0 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:45.873696+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 0 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:46.873920+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:47.874102+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:48.874255+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:49.874376+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:50.874609+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:51.874764+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59736064 unmapped: 1024000 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:52.875142+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59736064 unmapped: 1024000 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:53.875371+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59736064 unmapped: 1024000 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:54.875898+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59744256 unmapped: 1015808 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:55.876055+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59744256 unmapped: 1015808 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:56.876210+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:57.876445+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:58.876646+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:59.876886+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:00.877121+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:01.877399+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:02.877667+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:03.877889+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:04.878329+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:05.878670+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:06.879122+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59793408 unmapped: 966656 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:07.879508+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59793408 unmapped: 966656 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:08.879684+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59793408 unmapped: 966656 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:09.879819+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59801600 unmapped: 958464 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:10.879960+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59801600 unmapped: 958464 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:11.880121+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59809792 unmapped: 950272 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:12.880256+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59809792 unmapped: 950272 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:13.880388+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59809792 unmapped: 950272 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:14.880611+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59817984 unmapped: 942080 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:15.880835+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59817984 unmapped: 942080 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:16.881027+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59826176 unmapped: 933888 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:17.883666+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59826176 unmapped: 933888 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:18.886024+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59826176 unmapped: 933888 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:19.887976+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59834368 unmapped: 925696 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:20.888787+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59834368 unmapped: 925696 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:21.889761+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59842560 unmapped: 917504 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:22.889951+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59842560 unmapped: 917504 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:23.890140+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59842560 unmapped: 917504 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:24.890479+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59850752 unmapped: 909312 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:25.890619+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59850752 unmapped: 909312 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:26.890783+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59858944 unmapped: 901120 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:27.891071+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59858944 unmapped: 901120 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:28.891209+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59858944 unmapped: 901120 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:29.891451+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59867136 unmapped: 892928 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:30.891665+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59867136 unmapped: 892928 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:31.891814+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59875328 unmapped: 884736 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:32.891994+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59875328 unmapped: 884736 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:33.892211+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59875328 unmapped: 884736 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:34.892441+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59883520 unmapped: 876544 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:35.892617+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59883520 unmapped: 876544 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:36.892766+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59891712 unmapped: 868352 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:37.892929+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59891712 unmapped: 868352 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:38.893098+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59891712 unmapped: 868352 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:39.893255+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59899904 unmapped: 860160 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:40.893378+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59899904 unmapped: 860160 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:41.893701+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59899904 unmapped: 860160 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:42.893837+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59908096 unmapped: 851968 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:43.894086+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59908096 unmapped: 851968 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:44.894400+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59908096 unmapped: 851968 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:45.894632+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59916288 unmapped: 843776 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:46.894919+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59916288 unmapped: 843776 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:47.895030+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:48.895211+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59916288 unmapped: 843776 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:49.895407+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59924480 unmapped: 835584 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:50.895542+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59924480 unmapped: 835584 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:51.895743+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59932672 unmapped: 827392 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:52.895892+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59932672 unmapped: 827392 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:53.896059+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59932672 unmapped: 827392 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:54.896268+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59940864 unmapped: 819200 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:55.896450+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59940864 unmapped: 819200 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:56.896565+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59940864 unmapped: 819200 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:57.896707+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59949056 unmapped: 811008 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:58.896860+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59949056 unmapped: 811008 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:59.897016+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59957248 unmapped: 802816 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:00.897205+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59957248 unmapped: 802816 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:01.897419+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59957248 unmapped: 802816 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:02.897549+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59965440 unmapped: 794624 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:03.897710+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59965440 unmapped: 794624 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:04.897995+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59973632 unmapped: 786432 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:05.898172+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59973632 unmapped: 786432 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:06.898325+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59973632 unmapped: 786432 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:07.898456+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59981824 unmapped: 778240 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:08.898577+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59981824 unmapped: 778240 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:09.898713+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59990016 unmapped: 770048 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:10.898833+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59990016 unmapped: 770048 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:11.899016+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59998208 unmapped: 761856 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:12.899215+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59998208 unmapped: 761856 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:13.899471+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59998208 unmapped: 761856 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:14.899730+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60006400 unmapped: 753664 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:15.899886+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60006400 unmapped: 753664 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:16.900058+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60006400 unmapped: 753664 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:17.900212+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60014592 unmapped: 745472 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:18.900457+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60014592 unmapped: 745472 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:19.900675+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60014592 unmapped: 745472 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:20.900804+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60022784 unmapped: 737280 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:21.900951+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60022784 unmapped: 737280 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:22.901133+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60030976 unmapped: 729088 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:23.901247+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60030976 unmapped: 729088 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:24.901339+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60030976 unmapped: 729088 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:25.901485+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60030976 unmapped: 729088 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:26.901616+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60030976 unmapped: 729088 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:27.901793+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60039168 unmapped: 720896 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:28.901922+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60039168 unmapped: 720896 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:29.902397+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60039168 unmapped: 720896 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:30.902532+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60047360 unmapped: 712704 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:31.902661+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60047360 unmapped: 712704 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:32.902752+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60055552 unmapped: 704512 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:33.902876+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60055552 unmapped: 704512 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:34.903053+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60055552 unmapped: 704512 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:35.903190+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60063744 unmapped: 696320 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:36.903317+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60063744 unmapped: 696320 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:37.903454+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60063744 unmapped: 696320 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:38.903598+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60071936 unmapped: 688128 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:39.903756+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60071936 unmapped: 688128 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:40.903925+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60080128 unmapped: 679936 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:41.904101+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60080128 unmapped: 679936 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:42.904220+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60088320 unmapped: 671744 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:43.904404+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60088320 unmapped: 671744 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:44.904592+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60088320 unmapped: 671744 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:45.904810+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60096512 unmapped: 663552 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:46.904921+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60096512 unmapped: 663552 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:47.904991+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60104704 unmapped: 655360 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:48.905140+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60104704 unmapped: 655360 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:49.905273+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60104704 unmapped: 655360 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:50.905498+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60112896 unmapped: 647168 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:51.905669+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60112896 unmapped: 647168 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:52.905832+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60112896 unmapped: 647168 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:53.906052+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60121088 unmapped: 638976 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:54.906216+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60121088 unmapped: 638976 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:55.906387+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60121088 unmapped: 638976 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:56.906572+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60129280 unmapped: 630784 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:57.906673+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60129280 unmapped: 630784 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:58.906830+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60137472 unmapped: 622592 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:59.907068+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60137472 unmapped: 622592 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:00.907248+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60145664 unmapped: 614400 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:01.907408+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60145664 unmapped: 614400 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:02.907623+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60145664 unmapped: 614400 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:03.907801+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60153856 unmapped: 606208 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:04.908018+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60153856 unmapped: 606208 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:05.908158+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60162048 unmapped: 598016 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:06.908334+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60162048 unmapped: 598016 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:07.908500+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60162048 unmapped: 598016 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:08.908640+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60170240 unmapped: 589824 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:09.908775+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60170240 unmapped: 589824 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:10.908981+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60170240 unmapped: 589824 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:11.909101+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60178432 unmapped: 581632 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:12.909248+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60178432 unmapped: 581632 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:13.909354+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60186624 unmapped: 573440 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:14.909548+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60186624 unmapped: 573440 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:15.909778+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60194816 unmapped: 565248 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:16.910500+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60203008 unmapped: 557056 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:17.910680+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60203008 unmapped: 557056 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:18.910871+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60203008 unmapped: 557056 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:19.911008+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60211200 unmapped: 548864 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:20.911154+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60211200 unmapped: 548864 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:21.911312+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60219392 unmapped: 540672 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:22.911467+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60219392 unmapped: 540672 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:23.911694+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60219392 unmapped: 540672 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:24.911888+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60227584 unmapped: 532480 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:25.912047+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60227584 unmapped: 532480 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:26.912200+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60227584 unmapped: 532480 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:27.912362+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60235776 unmapped: 524288 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:28.912490+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60235776 unmapped: 524288 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:29.912691+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60243968 unmapped: 516096 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:30.912847+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60243968 unmapped: 516096 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:31.912985+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60252160 unmapped: 507904 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:32.913132+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60252160 unmapped: 507904 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:33.913485+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60252160 unmapped: 507904 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:34.913751+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60260352 unmapped: 499712 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:35.913863+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60260352 unmapped: 499712 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:36.913980+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60268544 unmapped: 491520 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:37.914102+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60268544 unmapped: 491520 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:38.914266+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60268544 unmapped: 491520 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:39.914436+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60276736 unmapped: 483328 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Cumulative writes: 4208 writes, 19K keys, 4208 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
                                           Cumulative WAL: 4208 writes, 369 syncs, 11.40 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 4208 writes, 19K keys, 4208 commit groups, 1.0 writes per commit group, ingest: 15.93 MB, 0.03 MB/s
                                           Interval WAL: 4208 writes, 369 syncs, 11.40 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b3090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b3090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b3090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:40.914568+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60334080 unmapped: 425984 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:41.914665+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60334080 unmapped: 425984 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:42.914834+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60342272 unmapped: 417792 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:43.914989+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60342272 unmapped: 417792 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:44.915159+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60342272 unmapped: 417792 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:45.915300+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60350464 unmapped: 409600 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:46.915413+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60350464 unmapped: 409600 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:47.915538+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60350464 unmapped: 409600 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:48.915684+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60358656 unmapped: 401408 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:49.915877+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60358656 unmapped: 401408 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:50.916132+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60366848 unmapped: 393216 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:51.916312+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60366848 unmapped: 393216 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:52.916536+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60375040 unmapped: 385024 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:53.916769+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60391424 unmapped: 368640 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:54.916944+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60391424 unmapped: 368640 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:55.917088+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60399616 unmapped: 360448 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:56.917251+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60399616 unmapped: 360448 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:57.917369+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60407808 unmapped: 352256 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:58.917519+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60407808 unmapped: 352256 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:59.917681+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60407808 unmapped: 352256 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:00.917841+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60416000 unmapped: 344064 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:01.918019+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60416000 unmapped: 344064 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:02.918129+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60416000 unmapped: 344064 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:03.918276+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60424192 unmapped: 335872 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:04.918521+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60424192 unmapped: 335872 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:05.918728+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60432384 unmapped: 327680 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:06.918925+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60432384 unmapped: 327680 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:07.919124+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60440576 unmapped: 319488 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:08.919328+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60440576 unmapped: 319488 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:09.919424+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60440576 unmapped: 319488 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:10.919540+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60448768 unmapped: 311296 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:11.919744+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60448768 unmapped: 311296 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:12.919890+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60456960 unmapped: 303104 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:13.920429+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60456960 unmapped: 303104 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:14.920694+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60456960 unmapped: 303104 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:15.920810+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60465152 unmapped: 294912 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:16.921012+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60465152 unmapped: 294912 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:17.921155+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60465152 unmapped: 294912 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:18.921323+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60473344 unmapped: 286720 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:19.921461+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60473344 unmapped: 286720 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:20.921612+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60481536 unmapped: 278528 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:21.921770+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60481536 unmapped: 278528 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:22.921927+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60481536 unmapped: 278528 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:23.922096+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60489728 unmapped: 270336 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:24.922625+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60489728 unmapped: 270336 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:25.922817+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60489728 unmapped: 270336 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:26.922937+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60497920 unmapped: 262144 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:27.923130+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60497920 unmapped: 262144 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:28.923312+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60497920 unmapped: 262144 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:29.923453+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60506112 unmapped: 253952 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:30.923612+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60506112 unmapped: 253952 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:31.923835+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60514304 unmapped: 245760 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:32.923980+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60514304 unmapped: 245760 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:33.924150+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60522496 unmapped: 237568 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:34.924342+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60522496 unmapped: 237568 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:35.924478+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60522496 unmapped: 237568 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:36.924672+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60530688 unmapped: 229376 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:37.924823+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60530688 unmapped: 229376 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:38.924954+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60538880 unmapped: 221184 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:39.925086+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60538880 unmapped: 221184 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:40.925232+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60538880 unmapped: 221184 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:41.925372+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60538880 unmapped: 221184 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:42.925532+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60547072 unmapped: 212992 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:43.925683+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60547072 unmapped: 212992 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:44.925866+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60555264 unmapped: 204800 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:45.926022+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60555264 unmapped: 204800 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:46.926180+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60555264 unmapped: 204800 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:47.926367+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60563456 unmapped: 196608 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:48.926497+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60563456 unmapped: 196608 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:49.926638+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60563456 unmapped: 196608 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:50.926797+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60571648 unmapped: 188416 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:51.926921+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60571648 unmapped: 188416 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:52.927070+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60579840 unmapped: 180224 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:53.927275+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60579840 unmapped: 180224 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:54.927476+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60579840 unmapped: 180224 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:55.927685+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60588032 unmapped: 172032 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:56.927921+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60588032 unmapped: 172032 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:57.928076+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60596224 unmapped: 163840 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:58.928248+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60596224 unmapped: 163840 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:59.928434+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60588032 unmapped: 172032 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:00.928592+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60596224 unmapped: 163840 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:01.928796+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60596224 unmapped: 163840 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:02.929012+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60604416 unmapped: 155648 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:03.929182+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60604416 unmapped: 155648 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:04.929388+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60604416 unmapped: 155648 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:05.929529+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60612608 unmapped: 147456 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:06.929679+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60612608 unmapped: 147456 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:07.929834+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60620800 unmapped: 139264 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:08.929917+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60620800 unmapped: 139264 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:09.930077+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60620800 unmapped: 139264 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:10.930198+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60628992 unmapped: 131072 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:11.930363+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60628992 unmapped: 131072 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:12.930519+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60637184 unmapped: 122880 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:13.930666+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60637184 unmapped: 122880 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:14.930890+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60637184 unmapped: 122880 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:15.931027+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60645376 unmapped: 114688 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:16.931214+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60645376 unmapped: 114688 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:17.931374+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60645376 unmapped: 114688 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:18.931526+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60653568 unmapped: 106496 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:19.931669+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60653568 unmapped: 106496 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:20.931790+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60661760 unmapped: 98304 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:21.931976+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60661760 unmapped: 98304 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:22.932122+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60661760 unmapped: 98304 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:23.932250+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:24.932432+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:25.932578+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:26.932727+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:27.932851+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:28.932962+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:29.933065+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:30.933173+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:31.933326+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60694528 unmapped: 65536 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:32.933465+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60694528 unmapped: 65536 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:33.933605+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60702720 unmapped: 57344 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:34.933812+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60702720 unmapped: 57344 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:35.933945+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60702720 unmapped: 57344 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:36.934119+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60710912 unmapped: 49152 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:37.934256+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60710912 unmapped: 49152 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:38.934409+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:39.934567+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:40.934752+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:41.934924+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:42.935105+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:43.935270+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:44.935577+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:45.935939+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:46.936237+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:47.936399+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:48.936608+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:49.936817+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:50.936999+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:51.937170+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:52.937421+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:53.937575+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:54.937767+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:55.937977+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:56.938150+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:57.938373+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:58.938545+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:59.938718+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:00.938879+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:01.939005+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:02.939186+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:03.939356+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:04.939514+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:05.939702+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:06.939930+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:07.940098+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:08.940220+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:09.940390+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:10.940535+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:11.940686+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:12.940837+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:13.941014+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:14.941190+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:15.941334+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:16.941456+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:17.941589+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:18.941711+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:19.941838+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:20.941975+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:21.942106+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:22.942246+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:23.942353+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:24.942532+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:25.942677+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:26.942838+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:27.943009+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:28.943154+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:29.943328+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:30.943446+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:31.943567+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:32.943683+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:33.943815+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:34.943977+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:35.944102+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:36.944226+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:37.944372+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:38.944487+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:39.944625+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:40.944777+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:41.944987+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:42.945145+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:43.945277+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:44.945424+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:45.945609+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:46.945758+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:47.945913+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:48.946035+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:49.946154+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:50.946310+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:51.946439+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:52.946566+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:53.946709+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:54.946868+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:55.947001+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:56.947114+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:57.947310+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:58.947441+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:59.947583+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:00.947764+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:01.954104+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:02.954261+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:03.954343+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:04.954520+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:05.954702+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:06.954890+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:07.955090+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:08.955265+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:09.955451+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:10.955565+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:11.955781+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:12.955909+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:13.956035+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:14.956186+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:15.956330+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:16.956442+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:17.956632+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:18.956824+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:19.956969+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:20.957145+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:21.957376+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:22.957514+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:23.957638+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:24.957803+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:25.957932+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:26.958062+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:27.958201+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:28.958366+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:29.958559+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:30.958796+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:31.958969+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:32.959167+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:33.959367+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:34.959563+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:35.959705+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:36.959862+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:37.960104+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:38.960284+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:39.960441+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:40.960548+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:41.960714+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:42.960858+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:43.961028+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:44.961232+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:45.961376+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:46.961513+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:47.961694+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:48.961923+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:49.962070+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:50.962188+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:51.963098+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:52.963338+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:53.963830+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:54.964004+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:55.964144+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:56.964318+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:57.964540+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:58.964692+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:59.964876+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:00.965027+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:01.965208+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:02.965380+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:03.965507+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:04.965768+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:05.966024+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:06.966209+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:07.966434+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:08.966559+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:09.966729+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:10.966869+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:11.967037+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:12.967224+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:13.967429+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:14.967650+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:15.967889+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:16.968062+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:17.968189+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:18.968355+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:19.968574+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:20.968741+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:21.968914+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:22.969113+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:23.969257+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:24.969488+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:25.969644+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:26.969806+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:27.969927+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:28.970154+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:29.970410+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:30.970559+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:31.970744+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:32.971236+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:33.971369+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:34.971554+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:35.971741+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:36.971921+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:37.972117+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:38.972320+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:39.972518+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60751872 unmapped: 8192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:40.972641+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60751872 unmapped: 8192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:41.972821+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60751872 unmapped: 8192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:42.972970+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60751872 unmapped: 8192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:43.973097+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60751872 unmapped: 8192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:44.973263+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60751872 unmapped: 8192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:45.973418+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60751872 unmapped: 8192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:46.973738+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60751872 unmapped: 8192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:47.973896+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60751872 unmapped: 8192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:48.974025+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60751872 unmapped: 8192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:49.974164+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: mgrc ms_handle_reset ms_handle_reset con 0x55c737203c00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/3312476512
Dec 01 09:39:37 compute-0 ceph-osd[88047]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/3312476512,v1:192.168.122.100:6801/3312476512]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: get_auth_request con 0x55c739d2a000 auth_method 0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: mgrc handle_mgr_configure stats_period=5
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:50.974620+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:51.974806+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:52.974952+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:53.975184+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:54.975394+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:55.975668+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:56.975817+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:57.976015+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 ms_handle_reset con 0x55c737443400 session 0x55c7371974a0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900b400
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:58.976197+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:59.976334+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:00.976453+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:01.976607+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:02.976813+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:03.977004+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:04.977179+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:05.977370+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:06.977505+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:07.977661+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:08.977803+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:09.977959+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:10.978086+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:11.978260+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:12.978417+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:13.978575+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:14.978779+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:15.978992+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:16.979129+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:17.979323+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:18.979496+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:19.979713+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:20.979886+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:21.980214+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:22.980476+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:23.980640+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:24.980840+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:25.980995+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:26.981154+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:27.981334+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:28.981486+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:29.981744+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:30.981918+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:31.982052+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:32.982675+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:33.982799+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:34.983006+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:35.983138+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:36.983271+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:37.983410+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:38.983552+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:39.983705+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:40.983844+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:41.984018+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:42.984160+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:43.984325+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:44.984480+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:45.984617+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:46.984749+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:47.984857+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:48.984975+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:49.985083+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:50.985201+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:51.985392+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:52.985536+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:53.985858+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:54.986082+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:55.986983+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:56.987137+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:57.987318+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:58.987490+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:59.987694+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:00.987861+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 ms_handle_reset con 0x55c73900a000 session 0x55c7384cc960
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900ac00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:01.988051+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:02.988194+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:03.988385+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:04.988588+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:05.988738+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:06.989032+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:07.989172+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:08.989386+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:09.989508+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:10.989650+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:11.989779+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:12.989939+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:13.990212+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:14.990377+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:15.990551+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:16.990684+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:17.990862+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:18.991063+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:19.991270+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:20.991508+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:21.991644+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:22.991787+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:23.991938+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:24.992131+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:25.992261+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:26.992536+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:27.992830+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:28.992983+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:29.993121+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:30.993258+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:31.993353+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:32.993497+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:33.993637+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:34.993849+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:35.994007+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:36.994127+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:37.994264+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:38.994350+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:39.994514+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:40.994731+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:41.994976+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:42.995131+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:43.995259+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:44.995429+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:45.995557+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:46.995692+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:47.995839+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:48.996015+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:49.996207+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:50.996405+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:51.996548+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:52.996670+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:53.996807+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:54.996987+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:55.997165+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:56.997359+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:57.998869+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:58.999121+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:59.999373+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:00.999489+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:01.999609+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:02.999825+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:03.999965+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:05.000708+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:06.000842+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:07.001067+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:08.001577+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:09.001888+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:10.002027+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:11.002223+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:12.002417+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:13.002812+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:14.003037+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:15.003215+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:16.003688+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:17.003868+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:18.004023+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:19.004179+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:20.004372+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:21.004494+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:22.004633+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:23.004757+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:24.004883+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:25.005073+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:26.005233+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:27.005452+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:28.005619+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:29.005805+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:30.005974+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:31.006077+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:32.012580+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:33.012713+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:34.012823+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:35.013018+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:36.013147+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:37.013303+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:38.013457+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:39.013615+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:40.013777+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:41.013932+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:42.014073+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:43.014269+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:44.014417+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:45.014582+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:46.014758+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:47.014873+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:48.015003+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:49.015166+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:50.015346+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:51.015487+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:52.015639+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:53.015771+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:54.015890+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:55.016080+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:56.016254+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:57.016441+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:58.016618+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:59.016746+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:00.016920+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:01.017082+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:02.017385+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:03.017909+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:04.018111+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:05.018332+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:06.018553+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:07.018758+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:08.018917+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:09.019116+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:10.019312+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:11.019476+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:12.019632+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:13.019809+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:14.019941+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:15.020097+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:16.020431+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:17.020638+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:18.020890+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:19.021176+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:20.021357+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:21.021573+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:22.021861+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:23.022136+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:24.022377+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:25.022752+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:26.023068+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:27.023388+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:28.023965+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:29.024277+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:30.024590+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:31.024808+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:32.025022+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:33.025279+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:34.025683+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:35.026118+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:36.026363+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:37.026569+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:38.026760+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:39.027041+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:40.027330+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:41.027515+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:42.027844+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:43.028057+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:44.028161+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:45.028363+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:46.028499+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:47.028614+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:48.028794+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:49.028975+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:50.029116+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:51.029325+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:52.029437+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:53.029575+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:54.029691+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:55.029858+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:56.029967+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:57.030128+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:58.030282+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:59.030429+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:00.030580+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:01.030769+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:02.030916+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:03.031063+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:04.031225+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:05.031569+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:06.031783+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:07.032002+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:08.032419+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:09.032649+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:10.032999+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:11.033222+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:12.033399+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:13.033574+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:14.033763+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:15.033986+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:16.034208+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:17.034370+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:18.034499+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:19.034604+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:20.034889+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:21.035023+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:22.035156+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:23.035368+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:24.035980+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:25.036178+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60694528 unmapped: 65536 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:26.036370+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60694528 unmapped: 65536 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:27.036571+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60694528 unmapped: 65536 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:28.036739+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60694528 unmapped: 65536 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:29.036898+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60694528 unmapped: 65536 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:30.037102+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:31.037364+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:32.037526+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:33.037664+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:34.037839+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:35.038047+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:36.038245+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:37.038402+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:38.038566+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:39.038681+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:40.038897+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Cumulative writes: 4208 writes, 19K keys, 4208 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 4208 writes, 369 syncs, 11.40 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b3090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b3090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b3090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:41.039078+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:42.039242+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:43.039423+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:44.039607+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:45.039878+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:46.040192+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:47.040383+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:48.040579+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:49.040812+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:50.041018+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:51.041237+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:52.041499+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:53.041735+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:54.041897+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:55.042083+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:56.042232+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:57.042449+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:58.042632+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:59.042791+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:00.042965+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:01.043095+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:02.043359+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:03.043508+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:04.043736+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:05.043906+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:06.044061+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:07.044187+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:08.044367+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:09.044567+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:10.044855+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:11.045529+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:12.045680+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:13.045840+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:14.045995+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:15.046537+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:16.046700+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:17.046894+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:18.047079+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:19.047375+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:20.047573+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:21.047765+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:22.047964+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:23.048138+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:24.048332+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:25.048540+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:26.048684+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:27.048828+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:28.048980+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:29.049109+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:30.049217+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:31.049353+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:32.049492+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:33.049649+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:34.049833+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:35.050033+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:36.050203+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:37.050405+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:38.050612+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:39.050775+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:40.050945+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:41.051095+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:42.051243+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:43.051425+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:44.051549+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:45.051715+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:46.051888+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:47.052036+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:48.052171+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:49.052488+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:50.052663+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:51.052835+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:52.053045+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:53.053180+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:54.053358+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:55.053569+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:56.053726+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:57.053884+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:58.054054+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:59.054269+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:00.054474+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:01.054639+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:02.054803+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:03.054991+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:04.055137+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:05.055341+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:06.055492+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:07.055720+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:08.055914+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:09.056146+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:10.056418+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:11.056595+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:12.056730+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:13.056851+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:14.057019+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:15.057203+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:16.057398+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:17.057584+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:18.057872+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:19.058143+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:20.058406+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:21.058548+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:22.058680+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:23.058874+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:24.059090+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:25.059389+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:26.059565+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:27.059816+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:28.060001+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:29.060146+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:30.060357+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:31.060507+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:32.060685+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:33.060874+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:34.061020+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:35.061209+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:36.061351+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:37.061623+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:38.061767+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:39.061908+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:40.062073+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:41.062233+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:42.062377+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:43.062514+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:44.062654+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:45.062866+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:46.063077+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:47.063225+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:48.063402+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:49.063571+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 47 handle_osd_map epochs [48,48], i have 47, src has [1,48]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 1115.405761719s of 1115.446777344s, submitted: 10
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:50.063758+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 48 heartbeat osd_stat(store_statfs(0x4fe155000/0x0/0x4ffc00000, data 0x2efc8/0x78000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 48 handle_osd_map epochs [49,49], i have 48, src has [1,49]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 48 handle_osd_map epochs [49,49], i have 49, src has [1,49]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:51.063926+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 418975 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 1073152 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 49 handle_osd_map epochs [50,50], i have 49, src has [1,50]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 50 ms_handle_reset con 0x55c73900a000 session 0x55c739057a40
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:52.064098+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a400
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60825600 unmapped: 983040 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:53.064303+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60825600 unmapped: 983040 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 50 handle_osd_map epochs [50,51], i have 50, src has [1,51]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 51 ms_handle_reset con 0x55c73900a400 session 0x55c738b12960
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 51 heartbeat osd_stat(store_statfs(0x4fe14c000/0x0/0x4ffc00000, data 0x31bdf/0x80000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:54.064474+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60932096 unmapped: 876544 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:55.065374+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60932096 unmapped: 876544 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:56.065591+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 430013 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60932096 unmapped: 876544 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:57.065775+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60932096 unmapped: 876544 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:58.065944+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60932096 unmapped: 876544 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 51 heartbeat osd_stat(store_statfs(0x4fe149000/0x0/0x4ffc00000, data 0x331d8/0x84000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:59.066139+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60932096 unmapped: 876544 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 51 heartbeat osd_stat(store_statfs(0x4fe149000/0x0/0x4ffc00000, data 0x331d8/0x84000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:00.066359+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 51 handle_osd_map epochs [51,52], i have 51, src has [1,52]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.248213768s of 10.329307556s, submitted: 15
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60948480 unmapped: 860160 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:01.066559+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 432809 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60948480 unmapped: 860160 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:02.066773+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60948480 unmapped: 860160 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:03.066969+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:04.067143+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 52 heartbeat osd_stat(store_statfs(0x4fe146000/0x0/0x4ffc00000, data 0x34678/0x87000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:05.067414+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:06.067603+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 432809 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:07.067986+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:08.068225+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:09.068419+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:10.068645+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 52 heartbeat osd_stat(store_statfs(0x4fe146000/0x0/0x4ffc00000, data 0x34678/0x87000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:11.068791+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 432809 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:12.068983+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 52 heartbeat osd_stat(store_statfs(0x4fe146000/0x0/0x4ffc00000, data 0x34678/0x87000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:13.069140+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:14.069313+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 52 heartbeat osd_stat(store_statfs(0x4fe146000/0x0/0x4ffc00000, data 0x34678/0x87000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:15.069466+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:16.069636+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 432809 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:17.069791+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:18.070013+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:19.070203+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:20.071149+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 52 heartbeat osd_stat(store_statfs(0x4fe146000/0x0/0x4ffc00000, data 0x34678/0x87000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:21.071906+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 432809 data_alloc: 218103808 data_used: 16384
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:22.072099+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:23.072264+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:24.072697+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 52 heartbeat osd_stat(store_statfs(0x4fe146000/0x0/0x4ffc00000, data 0x34678/0x87000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:25.073054+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c737c7bc00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 25.547399521s of 25.560478210s, submitted: 9
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:26.073374+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 52 handle_osd_map epochs [52,53], i have 52, src has [1,53]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 439721 data_alloc: 218103808 data_used: 24576
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 53 ms_handle_reset con 0x55c737c7bc00 session 0x55c738b12000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 61038592 unmapped: 770048 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:27.073590+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739d63800
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 53 ms_handle_reset con 0x55c739d63800 session 0x55c738b13a40
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739d63c00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 53 ms_handle_reset con 0x55c739d63c00 session 0x55c738d850e0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 61087744 unmapped: 720896 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:28.073799+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 53 heartbeat osd_stat(store_statfs(0x4fe141000/0x0/0x4ffc00000, data 0x36055/0x8c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 61087744 unmapped: 720896 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c737c7bc00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:29.073996+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 53 handle_osd_map epochs [54,54], i have 53, src has [1,54]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 54 ms_handle_reset con 0x55c737c7bc00 session 0x55c738d84000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 61333504 unmapped: 475136 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 54 ms_handle_reset con 0x55c73900a000 session 0x55c738adbe00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a400
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 54 ms_handle_reset con 0x55c73900a400 session 0x55c738ada780
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:30.074160+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739d63800
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 61358080 unmapped: 450560 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:31.074361+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 54 heartbeat osd_stat(store_statfs(0x4fe13b000/0x0/0x4ffc00000, data 0x37a55/0x92000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 54 handle_osd_map epochs [55,55], i have 54, src has [1,55]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 450304 data_alloc: 218103808 data_used: 40960
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 55 ms_handle_reset con 0x55c739d63800 session 0x55c737196f00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 62545920 unmapped: 311296 heap: 62857216 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:32.074609+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739db2800
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 55 heartbeat osd_stat(store_statfs(0x4fe138000/0x0/0x4ffc00000, data 0x39040/0x94000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 62554112 unmapped: 303104 heap: 62857216 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:33.075217+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 55 handle_osd_map epochs [56,56], i have 55, src has [1,56]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 56 ms_handle_reset con 0x55c739db2800 session 0x55c738612960
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 62513152 unmapped: 1392640 heap: 63905792 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:34.075583+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 56 heartbeat osd_stat(store_statfs(0x4fe13a000/0x0/0x4ffc00000, data 0x39e08/0x93000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 62513152 unmapped: 1392640 heap: 63905792 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c737c7bc00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:35.075760+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 56 handle_osd_map epochs [57,57], i have 56, src has [1,57]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 57 ms_handle_reset con 0x55c737c7bc00 session 0x55c738648b40
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 62545920 unmapped: 1359872 heap: 63905792 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:36.075912+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 458059 data_alloc: 218103808 data_used: 40960
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 62562304 unmapped: 1343488 heap: 63905792 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:37.076072+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a400
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.251831055s of 11.507549286s, submitted: 61
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 57 ms_handle_reset con 0x55c73900a400 session 0x55c7386481e0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 62603264 unmapped: 1302528 heap: 63905792 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 57 handle_osd_map epochs [58,58], i have 57, src has [1,58]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:38.076201+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739d63800
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 58 handle_osd_map epochs [59,59], i have 58, src has [1,59]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 59 ms_handle_reset con 0x55c73900a000 session 0x55c738adbe00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 62996480 unmapped: 17694720 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739db3c00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:39.076542+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 59 heartbeat osd_stat(store_statfs(0x4fc92c000/0x0/0x4ffc00000, data 0x183e90e/0x18a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 59 handle_osd_map epochs [59,60], i have 59, src has [1,60]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 59 handle_osd_map epochs [60,60], i have 60, src has [1,60]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 60 ms_handle_reset con 0x55c739db3c00 session 0x55c738ada960
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 60 ms_handle_reset con 0x55c739d63800 session 0x55c7371974a0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 63070208 unmapped: 17620992 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 60 heartbeat osd_stat(store_statfs(0x4fc928000/0x0/0x4ffc00000, data 0x183ff1a/0x18a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:40.076694+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c737c7bc00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 60 handle_osd_map epochs [61,61], i have 60, src has [1,61]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 61 ms_handle_reset con 0x55c737c7bc00 session 0x55c739090000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a400
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 63135744 unmapped: 17555456 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:41.076863+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 61 ms_handle_reset con 0x55c73900a400 session 0x55c7390910e0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 61 handle_osd_map epochs [62,62], i have 61, src has [1,62]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 649667 data_alloc: 218103808 data_used: 57344
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 62 ms_handle_reset con 0x55c73900a000 session 0x55c7389ad680
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 63184896 unmapped: 17506304 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739db3c00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739db3800
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:42.077056+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 62 handle_osd_map epochs [63,63], i have 62, src has [1,63]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 63 ms_handle_reset con 0x55c739db3c00 session 0x55c7390905a0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 63 ms_handle_reset con 0x55c739db3800 session 0x55c7389ac3c0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 63225856 unmapped: 17465344 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:43.077203+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 63 heartbeat osd_stat(store_statfs(0x4fe11e000/0x0/0x4ffc00000, data 0x4438b/0xaf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 63 handle_osd_map epochs [64,64], i have 63, src has [1,64]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 63 handle_osd_map epochs [64,64], i have 64, src has [1,64]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 63234048 unmapped: 17457152 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:44.077346+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 64 heartbeat osd_stat(store_statfs(0x4fe11a000/0x0/0x4ffc00000, data 0x45971/0xb2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 64 handle_osd_map epochs [65,65], i have 64, src has [1,65]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 64 handle_osd_map epochs [65,65], i have 65, src has [1,65]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 63275008 unmapped: 17416192 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c737c7bc00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:45.077510+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 65 handle_osd_map epochs [65,66], i have 65, src has [1,66]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 66 heartbeat osd_stat(store_statfs(0x4fe116000/0x0/0x4ffc00000, data 0x46fab/0xb5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 66 ms_handle_reset con 0x55c737c7bc00 session 0x55c739090d20
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 63315968 unmapped: 17375232 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:46.077668+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 499959 data_alloc: 218103808 data_used: 122880
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 66 handle_osd_map epochs [67,67], i have 66, src has [1,67]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 67 ms_handle_reset con 0x55c73900a000 session 0x55c7389c92c0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 63348736 unmapped: 17342464 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a400
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:47.077808+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739db3c00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 67 handle_osd_map epochs [68,68], i have 67, src has [1,68]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.564005852s of 10.165904045s, submitted: 115
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 63397888 unmapped: 17293312 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 68 ms_handle_reset con 0x55c73900a400 session 0x55c7389a6d20
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739db3400
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:48.078011+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 68 handle_osd_map epochs [69,69], i have 68, src has [1,69]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 69 ms_handle_reset con 0x55c739db3c00 session 0x55c738ada5a0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 69 ms_handle_reset con 0x55c739db3400 session 0x55c7389a74a0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 63528960 unmapped: 17162240 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c737c7bc00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:49.078258+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 69 handle_osd_map epochs [70,70], i have 69, src has [1,70]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 70 ms_handle_reset con 0x55c737c7bc00 session 0x55c738ada780
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 70 ms_handle_reset con 0x55c73900a000 session 0x55c7371974a0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 64716800 unmapped: 15974400 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a400
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:50.078495+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 70 handle_osd_map epochs [70,71], i have 70, src has [1,71]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 71 ms_handle_reset con 0x55c73900a400 session 0x55c7384d9c20
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 64782336 unmapped: 15908864 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 71 heartbeat osd_stat(store_statfs(0x4fe10a000/0x0/0x4ffc00000, data 0x4ea1e/0xc0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:51.078685+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 71 heartbeat osd_stat(store_statfs(0x4fe10a000/0x0/0x4ffc00000, data 0x4ea1e/0xc0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 513448 data_alloc: 218103808 data_used: 131072
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 64782336 unmapped: 15908864 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:52.078854+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 64782336 unmapped: 15908864 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:53.079000+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 64782336 unmapped: 15908864 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:54.079143+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 71 heartbeat osd_stat(store_statfs(0x4fe10a000/0x0/0x4ffc00000, data 0x4ea1e/0xc0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 71 handle_osd_map epochs [71,72], i have 71, src has [1,72]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 64806912 unmapped: 15884288 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:55.079331+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739db3c00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 72 handle_osd_map epochs [73,73], i have 72, src has [1,73]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 73 ms_handle_reset con 0x55c739db3c00 session 0x55c7389a70e0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 66953216 unmapped: 13737984 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:56.079450+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 521811 data_alloc: 218103808 data_used: 131072
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 66953216 unmapped: 13737984 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:57.079631+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 66969600 unmapped: 13721600 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:58.079779+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 66969600 unmapped: 13721600 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 73 heartbeat osd_stat(store_statfs(0x4fe105000/0x0/0x4ffc00000, data 0x51719/0xc7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:59.079934+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 73 handle_osd_map epochs [73,74], i have 73, src has [1,74]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.513625145s of 12.047250748s, submitted: 150
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 66977792 unmapped: 13713408 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739db3000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:00.080067+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 74 handle_osd_map epochs [75,75], i have 74, src has [1,75]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 75 ms_handle_reset con 0x55c739db3000 session 0x55c7389a7680
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 66863104 unmapped: 13828096 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:01.080274+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c737c7bc00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 529152 data_alloc: 218103808 data_used: 143360
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 75 ms_handle_reset con 0x55c737c7bc00 session 0x55c7389a7e00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 66871296 unmapped: 13819904 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:02.080519+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 75 ms_handle_reset con 0x55c73900a000 session 0x55c738649e00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a400
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 75 ms_handle_reset con 0x55c73900a400 session 0x55c738649860
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 66871296 unmapped: 13819904 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:03.080677+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739db3c00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 75 handle_osd_map epochs [76,76], i have 75, src has [1,76]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 76 handle_osd_map epochs [77,77], i have 76, src has [1,77]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 66961408 unmapped: 13729792 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 77 ms_handle_reset con 0x55c739db3c00 session 0x55c738649680
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:04.080815+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 77 handle_osd_map epochs [77,78], i have 77, src has [1,78]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739db2c00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 78 ms_handle_reset con 0x55c739db2c00 session 0x55c737197860
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67018752 unmapped: 13672448 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 78 heartbeat osd_stat(store_statfs(0x4fdce9000/0x0/0x4ffc00000, data 0x56f89/0xd3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e3f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:05.081035+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 78 handle_osd_map epochs [79,79], i have 78, src has [1,79]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67067904 unmapped: 13623296 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:06.081177+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 546256 data_alloc: 218103808 data_used: 143360
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67067904 unmapped: 13623296 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:07.081339+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 79 heartbeat osd_stat(store_statfs(0x4fdce0000/0x0/0x4ffc00000, data 0x59a92/0xda000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e3f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67067904 unmapped: 13623296 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c737c7bc00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 79 ms_handle_reset con 0x55c737c7bc00 session 0x55c739090d20
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 79 ms_handle_reset con 0x55c73900a000 session 0x55c737a0c780
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a400
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 79 ms_handle_reset con 0x55c73900a400 session 0x55c7389ac5a0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:08.081584+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739db2c00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 79 ms_handle_reset con 0x55c739db2c00 session 0x55c7384cda40
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739db3c00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739e17400
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 79 ms_handle_reset con 0x55c739e17400 session 0x55c7386130e0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 79 ms_handle_reset con 0x55c739db3c00 session 0x55c7384cd680
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c737c7bc00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 79 ms_handle_reset con 0x55c737c7bc00 session 0x55c7384cc000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739000400
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 79 ms_handle_reset con 0x55c739000400 session 0x55c7390901e0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a400
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 79 ms_handle_reset con 0x55c73900a400 session 0x55c7390910e0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67092480 unmapped: 13598720 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 79 ms_handle_reset con 0x55c73900a000 session 0x55c738a9bc20
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:09.081764+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c737c7bc00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 79 ms_handle_reset con 0x55c737c7bc00 session 0x55c738a9b4a0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67092480 unmapped: 13598720 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739000400
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 79 ms_handle_reset con 0x55c739000400 session 0x55c738a9be00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:10.081978+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a400
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 79 handle_osd_map epochs [80,80], i have 79, src has [1,80]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.614140511s of 10.876181602s, submitted: 69
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 80 ms_handle_reset con 0x55c73900a400 session 0x55c7384d9c20
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739db3c00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 80 ms_handle_reset con 0x55c739db3c00 session 0x55c7384d8960
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67190784 unmapped: 13500416 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:11.082101+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739e16800
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739e16400
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 552791 data_alloc: 218103808 data_used: 159744
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 80 heartbeat osd_stat(store_statfs(0x4fdcde000/0x0/0x4ffc00000, data 0x5af8a/0xdf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e3f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67264512 unmapped: 13426688 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:12.082219+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67264512 unmapped: 13426688 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:13.082363+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67264512 unmapped: 13426688 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:14.082533+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 80 heartbeat osd_stat(store_statfs(0x4fdcde000/0x0/0x4ffc00000, data 0x5af8a/0xdf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e3f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67264512 unmapped: 13426688 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:15.082742+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739e16000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 80 ms_handle_reset con 0x55c739e16000 session 0x55c7384d92c0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739e16000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67354624 unmapped: 13336576 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:16.082915+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 80 handle_osd_map epochs [80,81], i have 80, src has [1,81]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 81 ms_handle_reset con 0x55c739e16000 session 0x55c7384d9c20
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c737c7bc00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739000400
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 81 ms_handle_reset con 0x55c739000400 session 0x55c738d84f00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 81 ms_handle_reset con 0x55c737c7bc00 session 0x55c7384d92c0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a400
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 81 ms_handle_reset con 0x55c73900a400 session 0x55c738a9bc20
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 559232 data_alloc: 218103808 data_used: 159744
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739db3c00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67526656 unmapped: 13164544 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:17.083231+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 81 handle_osd_map epochs [81,82], i have 81, src has [1,82]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 82 ms_handle_reset con 0x55c739db3c00 session 0x55c738a9be00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739db3c00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 82 heartbeat osd_stat(store_statfs(0x4fdcd9000/0x0/0x4ffc00000, data 0x5c586/0xe3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e3f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 13189120 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:18.083424+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 82 handle_osd_map epochs [82,83], i have 82, src has [1,83]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 83 ms_handle_reset con 0x55c739db3c00 session 0x55c7390910e0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67526656 unmapped: 13164544 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:19.083597+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67526656 unmapped: 13164544 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:20.083794+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c737c7bc00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 83 heartbeat osd_stat(store_statfs(0x4fdcd4000/0x0/0x4ffc00000, data 0x5f16a/0xe9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e3f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67526656 unmapped: 13164544 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.453618050s of 10.603578568s, submitted: 54
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 83 ms_handle_reset con 0x55c737c7bc00 session 0x55c7384cc000
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:21.083940+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739000400
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 562680 data_alloc: 218103808 data_used: 163840
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67567616 unmapped: 13123584 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:22.084131+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 83 handle_osd_map epochs [83,84], i have 83, src has [1,84]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 84 ms_handle_reset con 0x55c739000400 session 0x55c738afd2c0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67674112 unmapped: 13017088 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:23.084319+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 84 heartbeat osd_stat(store_statfs(0x4fdcd1000/0x0/0x4ffc00000, data 0x60752/0xeb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e3f9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 84 ms_handle_reset con 0x55c739e16800 session 0x55c7389a4960
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 84 ms_handle_reset con 0x55c739e16400 session 0x55c73798a960
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c737c7bc00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67674112 unmapped: 13017088 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:24.084530+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67674112 unmapped: 13017088 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:25.084728+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 84 handle_osd_map epochs [85,85], i have 84, src has [1,85]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67690496 unmapped: 13000704 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:26.084866+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 85 handle_osd_map epochs [85,86], i have 85, src has [1,86]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 86 ms_handle_reset con 0x55c737c7bc00 session 0x55c738d85e00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 569627 data_alloc: 218103808 data_used: 172032
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 12877824 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:27.085046+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 12877824 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:28.085226+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 86 heartbeat osd_stat(store_statfs(0x4fcb2e000/0x0/0x4ffc00000, data 0x631d9/0xee000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 12877824 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:29.085394+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739000400
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 86 ms_handle_reset con 0x55c739000400 session 0x55c738b12b40
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 12828672 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:30.085535+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 12828672 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:31.085705+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739db3c00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 86 handle_osd_map epochs [87,87], i have 86, src has [1,87]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.347141266s of 10.605584145s, submitted: 81
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 87 ms_handle_reset con 0x55c739db3c00 session 0x55c738af8b40
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739e16800
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 571375 data_alloc: 218103808 data_used: 176128
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67903488 unmapped: 12787712 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:32.085843+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 87 handle_osd_map epochs [88,88], i have 87, src has [1,88]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 88 ms_handle_reset con 0x55c739e16800 session 0x55c738af9e00
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:33.086027+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 88 heartbeat osd_stat(store_statfs(0x4fcb29000/0x0/0x4ffc00000, data 0x65c90/0xf3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67944448 unmapped: 12746752 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:34.086208+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67944448 unmapped: 12746752 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:35.086352+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 88 handle_osd_map epochs [88,89], i have 88, src has [1,89]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 12730368 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:36.086474+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 578765 data_alloc: 218103808 data_used: 176128
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 12730368 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:37.086654+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 12730368 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:38.086856+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 12730368 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:39.087014+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 89 heartbeat osd_stat(store_statfs(0x4fcb27000/0x0/0x4ffc00000, data 0x6714c/0xf6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 12730368 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:40.087181+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 12730368 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:41.087347+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 89 heartbeat osd_stat(store_statfs(0x4fcb27000/0x0/0x4ffc00000, data 0x6714c/0xf6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 578765 data_alloc: 218103808 data_used: 176128
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 12730368 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:42.087503+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 89 handle_osd_map epochs [90,90], i have 89, src has [1,90]
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.186514854s of 11.261335373s, submitted: 63
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67944448 unmapped: 12746752 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:43.087636+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67952640 unmapped: 12738560 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:44.087783+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67952640 unmapped: 12738560 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:45.087962+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 12730368 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:46.088148+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 12730368 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:47.088312+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 12730368 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:48.088440+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 12730368 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:49.088581+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 12730368 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:50.088733+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 12730368 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:51.088870+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 12730368 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:52.088993+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:53.089115+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:54.089340+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:55.089530+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:56.089667+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:57.089840+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:58.090000+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:59.090186+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:00.090395+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:01.090571+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:02.090710+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:03.090866+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:04.091012+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:05.091216+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:06.091350+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:07.091566+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:08.091769+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:09.091990+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:10.092186+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:11.092377+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:12.092593+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:13.092782+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:14.092970+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:15.093256+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:16.093538+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:17.093783+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:18.093941+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:19.094075+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:20.094273+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:21.094481+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:22.094728+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:23.094988+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:24.095227+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:25.095565+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:26.095765+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:27.096006+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:28.096203+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:29.096423+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:30.097624+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:31.100636+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:32.100855+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:33.101136+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:34.103687+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:35.105034+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:36.105503+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:37.105724+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:38.106813+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:39.107732+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:40.108499+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:41.108762+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:42.108927+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:43.109167+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:44.109385+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:45.109630+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:46.109801+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:47.109989+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:48.110327+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:49.110592+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:50.110803+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:51.110992+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:52.111126+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:53.111318+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:54.111536+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:55.111816+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:56.111978+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:57.112150+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:58.112317+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:59.112629+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:00.112852+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:01.113271+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:02.113455+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:03.113602+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:04.113767+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68042752 unmapped: 12648448 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: do_command 'config diff' '{prefix=config diff}'
Dec 01 09:39:37 compute-0 ceph-osd[88047]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec 01 09:39:37 compute-0 ceph-osd[88047]: do_command 'config show' '{prefix=config show}'
Dec 01 09:39:37 compute-0 ceph-osd[88047]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec 01 09:39:37 compute-0 ceph-osd[88047]: do_command 'counter dump' '{prefix=counter dump}'
Dec 01 09:39:37 compute-0 ceph-osd[88047]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec 01 09:39:37 compute-0 ceph-osd[88047]: do_command 'counter schema' '{prefix=counter schema}'
Dec 01 09:39:37 compute-0 ceph-osd[88047]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:05.114064+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 12165120 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:39:37 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:06.118605+0000)
Dec 01 09:39:37 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 12165120 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:39:37 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:39:37 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:39:37 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:39:37 compute-0 ceph-osd[88047]: do_command 'log dump' '{prefix=log dump}'
Dec 01 09:39:37 compute-0 ceph-mon[75031]: pgmap v872: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:37 compute-0 ceph-mon[75031]: from='client.14842 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:37 compute-0 ceph-mon[75031]: from='client.14848 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:37 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2162815706' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Dec 01 09:39:37 compute-0 ceph-mon[75031]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 01 09:39:37 compute-0 ceph-mon[75031]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 01 09:39:37 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2980152995' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Dec 01 09:39:37 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14858 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:37 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 01 09:39:37 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v873: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:37 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0) v1
Dec 01 09:39:37 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3150115902' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Dec 01 09:39:38 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df"} v 0) v1
Dec 01 09:39:38 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2036314466' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Dec 01 09:39:38 compute-0 ceph-mon[75031]: from='client.14858 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:38 compute-0 ceph-mon[75031]: pgmap v873: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:38 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3150115902' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Dec 01 09:39:38 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2036314466' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Dec 01 09:39:38 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "fs dump"} v 0) v1
Dec 01 09:39:38 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3690527146' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Dec 01 09:39:39 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "fs ls"} v 0) v1
Dec 01 09:39:39 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3830609254' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Dec 01 09:39:39 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14868 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:39 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3690527146' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Dec 01 09:39:39 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3830609254' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Dec 01 09:39:39 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v874: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:39 compute-0 podman[262735]: 2025-12-01 09:39:39.976887336 +0000 UTC m=+0.068777970 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:39:39 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mds stat"} v 0) v1
Dec 01 09:39:39 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3793354911' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Dec 01 09:39:39 compute-0 systemd[1]: Starting Hostname Service...
Dec 01 09:39:40 compute-0 systemd[1]: Started Hostname Service.
Dec 01 09:39:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon dump"} v 0) v1
Dec 01 09:39:40 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1635193271' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Dec 01 09:39:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:39:40 compute-0 ceph-mon[75031]: from='client.14868 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:40 compute-0 ceph-mon[75031]: pgmap v874: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:40 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3793354911' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Dec 01 09:39:40 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1635193271' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Dec 01 09:39:40 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14874 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:41 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd blocklist ls"} v 0) v1
Dec 01 09:39:41 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2289580041' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Dec 01 09:39:41 compute-0 ceph-mon[75031]: from='client.14874 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:41 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2289580041' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Dec 01 09:39:41 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14878 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:41 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v875: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:42 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14880 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:42 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd dump"} v 0) v1
Dec 01 09:39:42 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/722044415' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Dec 01 09:39:42 compute-0 ceph-mon[75031]: from='client.14878 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:42 compute-0 ceph-mon[75031]: pgmap v875: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:42 compute-0 ceph-mon[75031]: from='client.14880 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:42 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/722044415' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Dec 01 09:39:42 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd numa-status"} v 0) v1
Dec 01 09:39:42 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2919462963' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Dec 01 09:39:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:39:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:39:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:39:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:39:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:39:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:39:43 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14886 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:43 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14888 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:43 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:39:43 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:39:43 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:39:43 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 6.359070782053786e-08 of space, bias 1.0, pg target 1.907721234616136e-05 quantized to 32 (current 32)
Dec 01 09:39:43 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:39:43 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 6.359070782053786e-08 of space, bias 1.0, pg target 1.907721234616136e-05 quantized to 32 (current 32)
Dec 01 09:39:43 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:39:43 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:39:43 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:39:43 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0006661126644201341 of space, bias 1.0, pg target 0.19983379932604023 quantized to 32 (current 32)
Dec 01 09:39:43 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:39:43 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec 01 09:39:43 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:39:43 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:39:43 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2919462963' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Dec 01 09:39:43 compute-0 ceph-mon[75031]: from='client.14886 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:43 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v876: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail"} v 0) v1
Dec 01 09:39:44 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2707625625' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Dec 01 09:39:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd stat"} v 0) v1
Dec 01 09:39:44 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2276598157' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Dec 01 09:39:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Dec 01 09:39:44 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1973812972' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:39:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Dec 01 09:39:44 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1973812972' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:39:44 compute-0 ceph-mon[75031]: from='client.14888 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:44 compute-0 ceph-mon[75031]: pgmap v876: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:44 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2707625625' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Dec 01 09:39:44 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2276598157' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Dec 01 09:39:44 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/1973812972' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:39:44 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14895 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:45 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14900 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:45 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:39:45 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status"} v 0) v1
Dec 01 09:39:45 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/709959604' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec 01 09:39:45 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/1973812972' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:39:45 compute-0 ceph-mon[75031]: from='client.14895 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:45 compute-0 ceph-mon[75031]: from='client.14900 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:39:45 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/709959604' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec 01 09:39:45 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v877: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:46 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "time-sync-status"} v 0) v1
Dec 01 09:39:46 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2879919651' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Dec 01 09:39:46 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config dump", "format": "json-pretty"} v 0) v1
Dec 01 09:39:46 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/866786259' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Dec 01 09:39:46 compute-0 ceph-mon[75031]: pgmap v877: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:46 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2879919651' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Dec 01 09:39:46 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/866786259' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Dec 01 09:39:46 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14908 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:47 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "detail": "detail", "format": "json-pretty"} v 0) v1
Dec 01 09:39:47 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2828500858' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 01 09:39:47 compute-0 ovs-appctl[264161]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec 01 09:39:47 compute-0 ovs-appctl[264167]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec 01 09:39:47 compute-0 ovs-appctl[264174]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec 01 09:39:47 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v878: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:47 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json-pretty"} v 0) v1
Dec 01 09:39:47 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2802948051' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Dec 01 09:39:47 compute-0 ceph-mon[75031]: from='client.14908 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:47 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2828500858' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 01 09:39:47 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2802948051' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Dec 01 09:39:48 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "fs dump", "format": "json-pretty"} v 0) v1
Dec 01 09:39:48 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3571746550' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Dec 01 09:39:48 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "fs ls", "format": "json-pretty"} v 0) v1
Dec 01 09:39:48 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1618562418' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Dec 01 09:39:48 compute-0 ceph-mon[75031]: pgmap v878: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:48 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3571746550' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Dec 01 09:39:48 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1618562418' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Dec 01 09:39:49 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14918 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:49 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mds stat", "format": "json-pretty"} v 0) v1
Dec 01 09:39:49 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1142819566' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Dec 01 09:39:49 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v879: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:49 compute-0 ceph-mon[75031]: from='client.14918 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:49 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1142819566' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Dec 01 09:39:49 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon dump", "format": "json-pretty"} v 0) v1
Dec 01 09:39:49 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1127419745' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Dec 01 09:39:50 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14924 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:50 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd blocklist ls", "format": "json-pretty"} v 0) v1
Dec 01 09:39:50 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2732425395' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Dec 01 09:39:50 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:39:50 compute-0 ceph-mon[75031]: pgmap v879: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:50 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1127419745' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Dec 01 09:39:50 compute-0 ceph-mon[75031]: from='client.14924 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:50 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2732425395' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Dec 01 09:39:50 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14928 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:51 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14930 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:51 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd dump", "format": "json-pretty"} v 0) v1
Dec 01 09:39:51 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2879488514' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Dec 01 09:39:51 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v880: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:51 compute-0 ceph-mon[75031]: from='client.14928 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:51 compute-0 ceph-mon[75031]: from='client.14930 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:51 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2879488514' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Dec 01 09:39:52 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd numa-status", "format": "json-pretty"} v 0) v1
Dec 01 09:39:52 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/870625426' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Dec 01 09:39:52 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14936 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:52 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14938 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:52 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:39:52 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:39:52 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:39:52 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 6.359070782053786e-08 of space, bias 1.0, pg target 1.907721234616136e-05 quantized to 32 (current 32)
Dec 01 09:39:52 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:39:52 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 6.359070782053786e-08 of space, bias 1.0, pg target 1.907721234616136e-05 quantized to 32 (current 32)
Dec 01 09:39:52 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:39:52 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:39:52 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:39:52 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0006661126644201341 of space, bias 1.0, pg target 0.19983379932604023 quantized to 32 (current 32)
Dec 01 09:39:52 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:39:52 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec 01 09:39:52 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:39:52 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:39:52 compute-0 ceph-mon[75031]: pgmap v880: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:52 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/870625426' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Dec 01 09:39:52 compute-0 ceph-mon[75031]: from='client.14936 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:53 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"} v 0) v1
Dec 01 09:39:53 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2156995256' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 01 09:39:53 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd stat", "format": "json-pretty"} v 0) v1
Dec 01 09:39:53 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1864275176' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Dec 01 09:39:53 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v881: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:53 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14944 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:53 compute-0 ceph-mon[75031]: from='client.14938 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:53 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2156995256' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 01 09:39:53 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1864275176' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Dec 01 09:39:54 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.14946 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:54 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1
Dec 01 09:39:54 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3465339922' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec 01 09:39:54 compute-0 ceph-mon[75031]: pgmap v881: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:54 compute-0 ceph-mon[75031]: from='client.14944 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:54 compute-0 ceph-mon[75031]: from='client.14946 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:39:54 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3465339922' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec 01 09:39:55 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "time-sync-status", "format": "json-pretty"} v 0) v1
Dec 01 09:39:55 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1152992441' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Dec 01 09:39:55 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:39:55 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v882: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:55 compute-0 podman[265673]: 2025-12-01 09:39:55.847666901 +0000 UTC m=+0.079130408 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 01 09:39:55 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1152992441' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Dec 01 09:39:56 compute-0 virtqemud[250400]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 01 09:39:56 compute-0 ceph-mon[75031]: pgmap v882: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:57 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v883: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:59 compute-0 ceph-mon[75031]: pgmap v883: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:39:59 compute-0 systemd[1]: Starting Time & Date Service...
Dec 01 09:39:59 compute-0 systemd[1]: Started Time & Date Service.
Dec 01 09:39:59 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v884: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:00 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:40:01 compute-0 ceph-mon[75031]: pgmap v884: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:01 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v885: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:03 compute-0 ceph-mon[75031]: pgmap v885: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:03 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v886: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:05 compute-0 podman[266121]: 2025-12-01 09:40:05.048115545 +0000 UTC m=+0.137872478 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 01 09:40:05 compute-0 ceph-mon[75031]: pgmap v886: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:05 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:40:05 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v887: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:07 compute-0 ceph-mon[75031]: pgmap v887: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:07 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v888: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:09 compute-0 ceph-mon[75031]: pgmap v888: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:09 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v889: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:10 compute-0 podman[266148]: 2025-12-01 09:40:10.185220397 +0000 UTC m=+0.063116147 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 01 09:40:10 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:40:11 compute-0 ceph-mon[75031]: pgmap v889: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:11 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v890: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:40:13
Dec 01 09:40:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:40:13 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:40:13 compute-0 ceph-mgr[75324]: [balancer INFO root] pools ['vms', 'backups', 'volumes', '.mgr', 'cephfs.cephfs.data', 'cephfs.cephfs.meta', 'images']
Dec 01 09:40:13 compute-0 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec 01 09:40:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:40:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:40:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:40:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:40:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:40:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:40:13 compute-0 ceph-mon[75031]: pgmap v890: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:40:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:40:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:40:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:40:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:40:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:40:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:40:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:40:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:40:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:40:13 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v891: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:15 compute-0 ceph-mon[75031]: pgmap v891: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:15 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:40:15 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v892: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:17 compute-0 ceph-mon[75031]: pgmap v892: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:17 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v893: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:18 compute-0 ceph-mon[75031]: pgmap v893: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:40:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:40:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:40:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:40:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 6.359070782053786e-08 of space, bias 1.0, pg target 1.907721234616136e-05 quantized to 32 (current 32)
Dec 01 09:40:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:40:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 6.359070782053786e-08 of space, bias 1.0, pg target 1.907721234616136e-05 quantized to 32 (current 32)
Dec 01 09:40:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:40:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:40:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:40:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0006661126644201341 of space, bias 1.0, pg target 0.19983379932604023 quantized to 32 (current 32)
Dec 01 09:40:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:40:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec 01 09:40:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:40:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:40:19 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v894: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:40:20.479 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:40:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:40:20.480 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:40:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:40:20.480 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:40:20 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:40:20 compute-0 ceph-mon[75031]: pgmap v894: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:21 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v895: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:22 compute-0 ceph-mon[75031]: pgmap v895: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:23 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v896: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:23 compute-0 sudo[258778]: pam_unix(sudo:session): session closed for user root
Dec 01 09:40:23 compute-0 sshd-session[258777]: Received disconnect from 192.168.122.10 port 47554:11: disconnected by user
Dec 01 09:40:23 compute-0 sshd-session[258777]: Disconnected from user zuul 192.168.122.10 port 47554
Dec 01 09:40:23 compute-0 sshd-session[258774]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:40:23 compute-0 systemd[1]: session-52.scope: Deactivated successfully.
Dec 01 09:40:23 compute-0 systemd[1]: session-52.scope: Consumed 2min 34.231s CPU time, 642.1M memory peak, read 219.7M from disk, written 65.7M to disk.
Dec 01 09:40:23 compute-0 systemd-logind[788]: Session 52 logged out. Waiting for processes to exit.
Dec 01 09:40:24 compute-0 systemd-logind[788]: Removed session 52.
Dec 01 09:40:24 compute-0 sshd-session[266167]: Accepted publickey for zuul from 192.168.122.10 port 45170 ssh2: ECDSA SHA256:qeYLzcMt7u+aIXWvxTim9ZQrhR80x0VMZgLc05vjQmw
Dec 01 09:40:24 compute-0 systemd-logind[788]: New session 53 of user zuul.
Dec 01 09:40:24 compute-0 systemd[1]: Started Session 53 of User zuul.
Dec 01 09:40:24 compute-0 sshd-session[266167]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:40:24 compute-0 sudo[266171]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-0-2025-12-01-bnnjfxy.tar.xz
Dec 01 09:40:24 compute-0 sudo[266171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:40:24 compute-0 sudo[266171]: pam_unix(sudo:session): session closed for user root
Dec 01 09:40:24 compute-0 sshd-session[266170]: Received disconnect from 192.168.122.10 port 45170:11: disconnected by user
Dec 01 09:40:24 compute-0 sshd-session[266170]: Disconnected from user zuul 192.168.122.10 port 45170
Dec 01 09:40:24 compute-0 sshd-session[266167]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:40:24 compute-0 systemd[1]: session-53.scope: Deactivated successfully.
Dec 01 09:40:24 compute-0 systemd-logind[788]: Session 53 logged out. Waiting for processes to exit.
Dec 01 09:40:24 compute-0 systemd-logind[788]: Removed session 53.
Dec 01 09:40:24 compute-0 sshd-session[266196]: Accepted publickey for zuul from 192.168.122.10 port 45182 ssh2: ECDSA SHA256:qeYLzcMt7u+aIXWvxTim9ZQrhR80x0VMZgLc05vjQmw
Dec 01 09:40:24 compute-0 systemd-logind[788]: New session 54 of user zuul.
Dec 01 09:40:24 compute-0 systemd[1]: Started Session 54 of User zuul.
Dec 01 09:40:24 compute-0 sshd-session[266196]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:40:24 compute-0 sudo[266200]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Dec 01 09:40:24 compute-0 sudo[266200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:40:24 compute-0 sudo[266200]: pam_unix(sudo:session): session closed for user root
Dec 01 09:40:24 compute-0 sshd-session[266199]: Received disconnect from 192.168.122.10 port 45182:11: disconnected by user
Dec 01 09:40:24 compute-0 sshd-session[266199]: Disconnected from user zuul 192.168.122.10 port 45182
Dec 01 09:40:24 compute-0 sshd-session[266196]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:40:24 compute-0 systemd[1]: session-54.scope: Deactivated successfully.
Dec 01 09:40:24 compute-0 systemd-logind[788]: Session 54 logged out. Waiting for processes to exit.
Dec 01 09:40:24 compute-0 systemd-logind[788]: Removed session 54.
Dec 01 09:40:25 compute-0 ceph-mon[75031]: pgmap v896: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:40:25 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v897: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:25 compute-0 podman[266225]: 2025-12-01 09:40:25.985182093 +0000 UTC m=+0.082949688 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:40:27 compute-0 ceph-mon[75031]: pgmap v897: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:27 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v898: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:29 compute-0 ceph-mon[75031]: pgmap v898: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:29 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 01 09:40:29 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 01 09:40:29 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v899: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:30 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:40:31 compute-0 ceph-mon[75031]: pgmap v899: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:31 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v900: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:32 compute-0 nova_compute[250706]: 2025-12-01 09:40:32.049 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:40:33 compute-0 nova_compute[250706]: 2025-12-01 09:40:33.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:40:33 compute-0 ceph-mon[75031]: pgmap v900: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:33 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v901: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:34 compute-0 nova_compute[250706]: 2025-12-01 09:40:34.051 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:40:34 compute-0 nova_compute[250706]: 2025-12-01 09:40:34.052 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 09:40:34 compute-0 nova_compute[250706]: 2025-12-01 09:40:34.052 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 09:40:34 compute-0 sshd-session[266251]: Connection closed by 172.236.228.220 port 43978 [preauth]
Dec 01 09:40:34 compute-0 nova_compute[250706]: 2025-12-01 09:40:34.226 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 09:40:34 compute-0 nova_compute[250706]: 2025-12-01 09:40:34.226 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:40:34 compute-0 sshd-session[266253]: Connection closed by 172.236.228.220 port 43982 [preauth]
Dec 01 09:40:34 compute-0 sshd-session[266255]: Connection closed by 172.236.228.220 port 43998 [preauth]
Dec 01 09:40:35 compute-0 nova_compute[250706]: 2025-12-01 09:40:35.053 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:40:35 compute-0 nova_compute[250706]: 2025-12-01 09:40:35.053 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:40:35 compute-0 nova_compute[250706]: 2025-12-01 09:40:35.054 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:40:35 compute-0 ceph-mon[75031]: pgmap v901: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:35 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:40:35 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #45. Immutable memtables: 0.
Dec 01 09:40:35 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:40:35.658049) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 09:40:35 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 45
Dec 01 09:40:35 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582035658169, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 917, "num_deletes": 255, "total_data_size": 752734, "memory_usage": 770328, "flush_reason": "Manual Compaction"}
Dec 01 09:40:35 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #46: started
Dec 01 09:40:35 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582035669407, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 46, "file_size": 742049, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18181, "largest_seqno": 19097, "table_properties": {"data_size": 737434, "index_size": 2139, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 10912, "raw_average_key_size": 19, "raw_value_size": 727694, "raw_average_value_size": 1299, "num_data_blocks": 96, "num_entries": 560, "num_filter_entries": 560, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764581969, "oldest_key_time": 1764581969, "file_creation_time": 1764582035, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 46, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:40:35 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 11371 microseconds, and 6018 cpu microseconds.
Dec 01 09:40:35 compute-0 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 09:40:35 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:40:35.669484) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #46: 742049 bytes OK
Dec 01 09:40:35 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:40:35.669510) [db/memtable_list.cc:519] [default] Level-0 commit table #46 started
Dec 01 09:40:35 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:40:35.670988) [db/memtable_list.cc:722] [default] Level-0 commit table #46: memtable #1 done
Dec 01 09:40:35 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:40:35.671003) EVENT_LOG_v1 {"time_micros": 1764582035670998, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 01 09:40:35 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:40:35.671037) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 01 09:40:35 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 748052, prev total WAL file size 748052, number of live WAL files 2.
Dec 01 09:40:35 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000042.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:40:35 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:40:35.671741) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00323530' seq:72057594037927935, type:22 .. '6C6F676D00353031' seq:0, type:0; will stop at (end)
Dec 01 09:40:35 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 01 09:40:35 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [46(724KB)], [44(4629KB)]
Dec 01 09:40:35 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582035671775, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [46], "files_L6": [44], "score": -1, "input_data_size": 5482926, "oldest_snapshot_seqno": -1}
Dec 01 09:40:35 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #47: 3810 keys, 5383639 bytes, temperature: kUnknown
Dec 01 09:40:35 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582035707474, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 47, "file_size": 5383639, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 5356721, "index_size": 16313, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9541, "raw_key_size": 92267, "raw_average_key_size": 24, "raw_value_size": 5286668, "raw_average_value_size": 1387, "num_data_blocks": 696, "num_entries": 3810, "num_filter_entries": 3810, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580340, "oldest_key_time": 0, "file_creation_time": 1764582035, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:40:35 compute-0 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 09:40:35 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:40:35.707854) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 5383639 bytes
Dec 01 09:40:35 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:40:35.709816) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 153.1 rd, 150.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 4.5 +0.0 blob) out(5.1 +0.0 blob), read-write-amplify(14.6) write-amplify(7.3) OK, records in: 4332, records dropped: 522 output_compression: NoCompression
Dec 01 09:40:35 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:40:35.709849) EVENT_LOG_v1 {"time_micros": 1764582035709834, "job": 22, "event": "compaction_finished", "compaction_time_micros": 35808, "compaction_time_cpu_micros": 14955, "output_level": 6, "num_output_files": 1, "total_output_size": 5383639, "num_input_records": 4332, "num_output_records": 3810, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 09:40:35 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000046.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:40:35 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582035710235, "job": 22, "event": "table_file_deletion", "file_number": 46}
Dec 01 09:40:35 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:40:35 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582035711824, "job": 22, "event": "table_file_deletion", "file_number": 44}
Dec 01 09:40:35 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:40:35.671561) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:40:35 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:40:35.711939) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:40:35 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:40:35.711947) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:40:35 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:40:35.711951) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:40:35 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:40:35.711955) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:40:35 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:40:35.711959) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:40:35 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v902: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:36 compute-0 podman[266257]: 2025-12-01 09:40:36.035808022 +0000 UTC m=+0.123312810 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2)
Dec 01 09:40:36 compute-0 nova_compute[250706]: 2025-12-01 09:40:36.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:40:36 compute-0 sudo[266284]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:40:36 compute-0 sudo[266284]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:40:36 compute-0 sudo[266284]: pam_unix(sudo:session): session closed for user root
Dec 01 09:40:36 compute-0 sudo[266309]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:40:36 compute-0 sudo[266309]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:40:36 compute-0 sudo[266309]: pam_unix(sudo:session): session closed for user root
Dec 01 09:40:36 compute-0 sudo[266334]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:40:36 compute-0 sudo[266334]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:40:36 compute-0 sudo[266334]: pam_unix(sudo:session): session closed for user root
Dec 01 09:40:36 compute-0 sudo[266359]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Dec 01 09:40:36 compute-0 sudo[266359]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:40:36 compute-0 ceph-mon[75031]: pgmap v902: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:36 compute-0 nova_compute[250706]: 2025-12-01 09:40:36.868 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:40:36 compute-0 nova_compute[250706]: 2025-12-01 09:40:36.870 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:40:36 compute-0 nova_compute[250706]: 2025-12-01 09:40:36.870 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:40:36 compute-0 nova_compute[250706]: 2025-12-01 09:40:36.871 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 09:40:36 compute-0 nova_compute[250706]: 2025-12-01 09:40:36.872 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:40:37 compute-0 sudo[266359]: pam_unix(sudo:session): session closed for user root
Dec 01 09:40:37 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:40:37 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:40:37 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec 01 09:40:37 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:40:37 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec 01 09:40:37 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:40:37 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev e424af8b-fcd6-465b-b71e-eddf62211960 does not exist
Dec 01 09:40:37 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 3ce4a272-127a-4595-9d5a-c89714f4a36f does not exist
Dec 01 09:40:37 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 70e7b7af-7e8c-4c20-bef5-de9bc07f3471 does not exist
Dec 01 09:40:37 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec 01 09:40:37 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:40:37 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec 01 09:40:37 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:40:37 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:40:37 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:40:37 compute-0 sudo[266435]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:40:37 compute-0 sudo[266435]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:40:37 compute-0 sudo[266435]: pam_unix(sudo:session): session closed for user root
Dec 01 09:40:37 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:40:37 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3450527299' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:40:37 compute-0 nova_compute[250706]: 2025-12-01 09:40:37.341 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:40:37 compute-0 sudo[266460]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:40:37 compute-0 sudo[266460]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:40:37 compute-0 sudo[266460]: pam_unix(sudo:session): session closed for user root
Dec 01 09:40:37 compute-0 sudo[266487]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:40:37 compute-0 sudo[266487]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:40:37 compute-0 sudo[266487]: pam_unix(sudo:session): session closed for user root
Dec 01 09:40:37 compute-0 sudo[266512]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --yes --no-systemd
Dec 01 09:40:37 compute-0 sudo[266512]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:40:37 compute-0 nova_compute[250706]: 2025-12-01 09:40:37.511 250710 WARNING nova.virt.libvirt.driver [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 09:40:37 compute-0 nova_compute[250706]: 2025-12-01 09:40:37.512 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5136MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 09:40:37 compute-0 nova_compute[250706]: 2025-12-01 09:40:37.512 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:40:37 compute-0 nova_compute[250706]: 2025-12-01 09:40:37.513 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:40:37 compute-0 nova_compute[250706]: 2025-12-01 09:40:37.603 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 09:40:37 compute-0 nova_compute[250706]: 2025-12-01 09:40:37.603 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 09:40:37 compute-0 nova_compute[250706]: 2025-12-01 09:40:37.634 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:40:37 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:40:37 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:40:37 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:40:37 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:40:37 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:40:37 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:40:37 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3450527299' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:40:37 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v903: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:37 compute-0 podman[266588]: 2025-12-01 09:40:37.809022337 +0000 UTC m=+0.049960099 container create 7838ed26167195210c2c9a35dbd0e9285d9bde94dabbcb6dcebe59a60e4fe195 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_burnell, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250507, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Dec 01 09:40:37 compute-0 systemd[1]: Started libpod-conmon-7838ed26167195210c2c9a35dbd0e9285d9bde94dabbcb6dcebe59a60e4fe195.scope.
Dec 01 09:40:37 compute-0 podman[266588]: 2025-12-01 09:40:37.780020553 +0000 UTC m=+0.020958325 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:40:37 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:40:37 compute-0 podman[266588]: 2025-12-01 09:40:37.930779801 +0000 UTC m=+0.171717613 container init 7838ed26167195210c2c9a35dbd0e9285d9bde94dabbcb6dcebe59a60e4fe195 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_burnell, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_REF=reef)
Dec 01 09:40:37 compute-0 podman[266588]: 2025-12-01 09:40:37.937787542 +0000 UTC m=+0.178725334 container start 7838ed26167195210c2c9a35dbd0e9285d9bde94dabbcb6dcebe59a60e4fe195 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_burnell, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:40:37 compute-0 podman[266588]: 2025-12-01 09:40:37.941369775 +0000 UTC m=+0.182307567 container attach 7838ed26167195210c2c9a35dbd0e9285d9bde94dabbcb6dcebe59a60e4fe195 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_burnell, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:40:37 compute-0 systemd[1]: libpod-7838ed26167195210c2c9a35dbd0e9285d9bde94dabbcb6dcebe59a60e4fe195.scope: Deactivated successfully.
Dec 01 09:40:37 compute-0 strange_burnell[266613]: 167 167
Dec 01 09:40:37 compute-0 conmon[266613]: conmon 7838ed26167195210c2c <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-7838ed26167195210c2c9a35dbd0e9285d9bde94dabbcb6dcebe59a60e4fe195.scope/container/memory.events
Dec 01 09:40:37 compute-0 podman[266588]: 2025-12-01 09:40:37.94640337 +0000 UTC m=+0.187341202 container died 7838ed26167195210c2c9a35dbd0e9285d9bde94dabbcb6dcebe59a60e4fe195 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_burnell, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_REF=reef)
Dec 01 09:40:37 compute-0 systemd[1]: var-lib-containers-storage-overlay-3479c73ea911e6143598213da9b4a358468a70d313dc42497aad6962df20d6f0-merged.mount: Deactivated successfully.
Dec 01 09:40:37 compute-0 podman[266588]: 2025-12-01 09:40:37.990427607 +0000 UTC m=+0.231365359 container remove 7838ed26167195210c2c9a35dbd0e9285d9bde94dabbcb6dcebe59a60e4fe195 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_burnell, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=reef, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:40:38 compute-0 systemd[1]: libpod-conmon-7838ed26167195210c2c9a35dbd0e9285d9bde94dabbcb6dcebe59a60e4fe195.scope: Deactivated successfully.
Dec 01 09:40:38 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:40:38 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/612593680' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:40:38 compute-0 nova_compute[250706]: 2025-12-01 09:40:38.076 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:40:38 compute-0 nova_compute[250706]: 2025-12-01 09:40:38.087 250710 DEBUG nova.compute.provider_tree [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed in ProviderTree for provider: 847e3dbe-0f76-4032-a374-8c965945c22f update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 09:40:38 compute-0 nova_compute[250706]: 2025-12-01 09:40:38.125 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed for provider 847e3dbe-0f76-4032-a374-8c965945c22f based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 09:40:38 compute-0 nova_compute[250706]: 2025-12-01 09:40:38.128 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 09:40:38 compute-0 nova_compute[250706]: 2025-12-01 09:40:38.129 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.617s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:40:38 compute-0 podman[266639]: 2025-12-01 09:40:38.166674819 +0000 UTC m=+0.043685839 container create 6ec02990248a6c27f0732eecd76950d796fb244100717ffb1a0925638e00e35e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_franklin, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef)
Dec 01 09:40:38 compute-0 systemd[1]: Started libpod-conmon-6ec02990248a6c27f0732eecd76950d796fb244100717ffb1a0925638e00e35e.scope.
Dec 01 09:40:38 compute-0 podman[266639]: 2025-12-01 09:40:38.144883481 +0000 UTC m=+0.021894511 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:40:38 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:40:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ecde39194be1461bf1413fea891528ba8308bd18835e0c4db989cff4e679025b/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:40:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ecde39194be1461bf1413fea891528ba8308bd18835e0c4db989cff4e679025b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:40:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ecde39194be1461bf1413fea891528ba8308bd18835e0c4db989cff4e679025b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:40:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ecde39194be1461bf1413fea891528ba8308bd18835e0c4db989cff4e679025b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:40:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ecde39194be1461bf1413fea891528ba8308bd18835e0c4db989cff4e679025b/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:40:38 compute-0 podman[266639]: 2025-12-01 09:40:38.276643873 +0000 UTC m=+0.153654963 container init 6ec02990248a6c27f0732eecd76950d796fb244100717ffb1a0925638e00e35e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_franklin, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 01 09:40:38 compute-0 podman[266639]: 2025-12-01 09:40:38.28940289 +0000 UTC m=+0.166413920 container start 6ec02990248a6c27f0732eecd76950d796fb244100717ffb1a0925638e00e35e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_franklin, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True)
Dec 01 09:40:38 compute-0 podman[266639]: 2025-12-01 09:40:38.293654962 +0000 UTC m=+0.170665992 container attach 6ec02990248a6c27f0732eecd76950d796fb244100717ffb1a0925638e00e35e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_franklin, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:40:38 compute-0 ceph-mon[75031]: pgmap v903: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:38 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/612593680' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:40:39 compute-0 nova_compute[250706]: 2025-12-01 09:40:39.125 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:40:39 compute-0 nova_compute[250706]: 2025-12-01 09:40:39.151 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:40:39 compute-0 nova_compute[250706]: 2025-12-01 09:40:39.151 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 09:40:39 compute-0 condescending_franklin[266655]: --> passed data devices: 0 physical, 3 LVM
Dec 01 09:40:39 compute-0 condescending_franklin[266655]: --> relative data size: 1.0
Dec 01 09:40:39 compute-0 condescending_franklin[266655]: --> All data devices are unavailable
Dec 01 09:40:39 compute-0 systemd[1]: libpod-6ec02990248a6c27f0732eecd76950d796fb244100717ffb1a0925638e00e35e.scope: Deactivated successfully.
Dec 01 09:40:39 compute-0 podman[266639]: 2025-12-01 09:40:39.483042767 +0000 UTC m=+1.360053787 container died 6ec02990248a6c27f0732eecd76950d796fb244100717ffb1a0925638e00e35e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_franklin, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:40:39 compute-0 systemd[1]: libpod-6ec02990248a6c27f0732eecd76950d796fb244100717ffb1a0925638e00e35e.scope: Consumed 1.131s CPU time.
Dec 01 09:40:39 compute-0 systemd[1]: var-lib-containers-storage-overlay-ecde39194be1461bf1413fea891528ba8308bd18835e0c4db989cff4e679025b-merged.mount: Deactivated successfully.
Dec 01 09:40:39 compute-0 podman[266639]: 2025-12-01 09:40:39.559521508 +0000 UTC m=+1.436532498 container remove 6ec02990248a6c27f0732eecd76950d796fb244100717ffb1a0925638e00e35e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_franklin, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:40:39 compute-0 systemd[1]: libpod-conmon-6ec02990248a6c27f0732eecd76950d796fb244100717ffb1a0925638e00e35e.scope: Deactivated successfully.
Dec 01 09:40:39 compute-0 sudo[266512]: pam_unix(sudo:session): session closed for user root
Dec 01 09:40:39 compute-0 sudo[266698]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:40:39 compute-0 sudo[266698]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:40:39 compute-0 sudo[266698]: pam_unix(sudo:session): session closed for user root
Dec 01 09:40:39 compute-0 sudo[266723]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:40:39 compute-0 sudo[266723]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:40:39 compute-0 sudo[266723]: pam_unix(sudo:session): session closed for user root
Dec 01 09:40:39 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v904: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:39 compute-0 sudo[266748]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:40:39 compute-0 sudo[266748]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:40:39 compute-0 sudo[266748]: pam_unix(sudo:session): session closed for user root
Dec 01 09:40:39 compute-0 sudo[266773]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- lvm list --format json
Dec 01 09:40:39 compute-0 sudo[266773]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:40:40 compute-0 podman[266839]: 2025-12-01 09:40:40.277412095 +0000 UTC m=+0.057194057 container create d79b76689a8120a0856a687f0855e7d28c3bd4080beb1041107a1f1dae869d84 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_jemison, OSD_FLAVOR=default, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec 01 09:40:40 compute-0 systemd[1]: Started libpod-conmon-d79b76689a8120a0856a687f0855e7d28c3bd4080beb1041107a1f1dae869d84.scope.
Dec 01 09:40:40 compute-0 podman[266839]: 2025-12-01 09:40:40.248771651 +0000 UTC m=+0.028553663 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:40:40 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:40:40 compute-0 podman[266839]: 2025-12-01 09:40:40.363571315 +0000 UTC m=+0.143353337 container init d79b76689a8120a0856a687f0855e7d28c3bd4080beb1041107a1f1dae869d84 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_jemison, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 01 09:40:40 compute-0 podman[266839]: 2025-12-01 09:40:40.374446077 +0000 UTC m=+0.154227999 container start d79b76689a8120a0856a687f0855e7d28c3bd4080beb1041107a1f1dae869d84 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_jemison, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec 01 09:40:40 compute-0 podman[266839]: 2025-12-01 09:40:40.377990909 +0000 UTC m=+0.157772831 container attach d79b76689a8120a0856a687f0855e7d28c3bd4080beb1041107a1f1dae869d84 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_jemison, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2)
Dec 01 09:40:40 compute-0 zealous_jemison[266856]: 167 167
Dec 01 09:40:40 compute-0 systemd[1]: libpod-d79b76689a8120a0856a687f0855e7d28c3bd4080beb1041107a1f1dae869d84.scope: Deactivated successfully.
Dec 01 09:40:40 compute-0 podman[266839]: 2025-12-01 09:40:40.381670085 +0000 UTC m=+0.161452077 container died d79b76689a8120a0856a687f0855e7d28c3bd4080beb1041107a1f1dae869d84 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_jemison, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:40:40 compute-0 systemd[1]: var-lib-containers-storage-overlay-c9d3faa9dba5ef3c09c08d59f5b18d405f20e6df2cbfdb2075948c8296c9de22-merged.mount: Deactivated successfully.
Dec 01 09:40:40 compute-0 podman[266853]: 2025-12-01 09:40:40.42075665 +0000 UTC m=+0.102664765 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 01 09:40:40 compute-0 podman[266839]: 2025-12-01 09:40:40.428020869 +0000 UTC m=+0.207802801 container remove d79b76689a8120a0856a687f0855e7d28c3bd4080beb1041107a1f1dae869d84 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_jemison, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:40:40 compute-0 systemd[1]: libpod-conmon-d79b76689a8120a0856a687f0855e7d28c3bd4080beb1041107a1f1dae869d84.scope: Deactivated successfully.
Dec 01 09:40:40 compute-0 podman[266902]: 2025-12-01 09:40:40.617135671 +0000 UTC m=+0.055346614 container create f801b8c773f835150e806f91443bb19186081c85710c81e93893302b7f462a6a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_pasteur, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:40:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:40:40 compute-0 systemd[1]: Started libpod-conmon-f801b8c773f835150e806f91443bb19186081c85710c81e93893302b7f462a6a.scope.
Dec 01 09:40:40 compute-0 podman[266902]: 2025-12-01 09:40:40.589191287 +0000 UTC m=+0.027402300 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:40:40 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:40:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/08a7980ae0df1fc9704fa18106660beec2d6af0eb169fc04ac5a84343b514aed/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:40:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/08a7980ae0df1fc9704fa18106660beec2d6af0eb169fc04ac5a84343b514aed/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:40:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/08a7980ae0df1fc9704fa18106660beec2d6af0eb169fc04ac5a84343b514aed/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:40:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/08a7980ae0df1fc9704fa18106660beec2d6af0eb169fc04ac5a84343b514aed/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:40:40 compute-0 podman[266902]: 2025-12-01 09:40:40.752344712 +0000 UTC m=+0.190555745 container init f801b8c773f835150e806f91443bb19186081c85710c81e93893302b7f462a6a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_pasteur, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0)
Dec 01 09:40:40 compute-0 podman[266902]: 2025-12-01 09:40:40.765072318 +0000 UTC m=+0.203283291 container start f801b8c773f835150e806f91443bb19186081c85710c81e93893302b7f462a6a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_pasteur, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:40:40 compute-0 podman[266902]: 2025-12-01 09:40:40.769428623 +0000 UTC m=+0.207639596 container attach f801b8c773f835150e806f91443bb19186081c85710c81e93893302b7f462a6a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_pasteur, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, CEPH_REF=reef)
Dec 01 09:40:40 compute-0 ceph-mon[75031]: pgmap v904: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:41 compute-0 determined_pasteur[266919]: {
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:     "0": [
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:         {
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             "devices": [
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "/dev/loop3"
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             ],
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             "lv_name": "ceph_lv0",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             "lv_size": "21470642176",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             "name": "ceph_lv0",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             "tags": {
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.cluster_name": "ceph",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.crush_device_class": "",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.encrypted": "0",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.osd_id": "0",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.type": "block",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.vdo": "0"
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             },
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             "type": "block",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             "vg_name": "ceph_vg0"
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:         }
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:     ],
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:     "1": [
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:         {
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             "devices": [
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "/dev/loop4"
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             ],
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             "lv_name": "ceph_lv1",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             "lv_size": "21470642176",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             "name": "ceph_lv1",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             "tags": {
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.cluster_name": "ceph",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.crush_device_class": "",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.encrypted": "0",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.osd_id": "1",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.type": "block",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.vdo": "0"
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             },
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             "type": "block",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             "vg_name": "ceph_vg1"
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:         }
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:     ],
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:     "2": [
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:         {
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             "devices": [
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "/dev/loop5"
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             ],
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             "lv_name": "ceph_lv2",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             "lv_size": "21470642176",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             "name": "ceph_lv2",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             "tags": {
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.cluster_name": "ceph",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.crush_device_class": "",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.encrypted": "0",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.osd_id": "2",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.type": "block",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:                 "ceph.vdo": "0"
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             },
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             "type": "block",
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:             "vg_name": "ceph_vg2"
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:         }
Dec 01 09:40:41 compute-0 determined_pasteur[266919]:     ]
Dec 01 09:40:41 compute-0 determined_pasteur[266919]: }
Dec 01 09:40:41 compute-0 systemd[1]: libpod-f801b8c773f835150e806f91443bb19186081c85710c81e93893302b7f462a6a.scope: Deactivated successfully.
Dec 01 09:40:41 compute-0 podman[266902]: 2025-12-01 09:40:41.602259608 +0000 UTC m=+1.040470631 container died f801b8c773f835150e806f91443bb19186081c85710c81e93893302b7f462a6a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_pasteur, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 01 09:40:41 compute-0 systemd[1]: var-lib-containers-storage-overlay-08a7980ae0df1fc9704fa18106660beec2d6af0eb169fc04ac5a84343b514aed-merged.mount: Deactivated successfully.
Dec 01 09:40:41 compute-0 podman[266902]: 2025-12-01 09:40:41.681529859 +0000 UTC m=+1.119740802 container remove f801b8c773f835150e806f91443bb19186081c85710c81e93893302b7f462a6a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=determined_pasteur, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, ceph=True)
Dec 01 09:40:41 compute-0 systemd[1]: libpod-conmon-f801b8c773f835150e806f91443bb19186081c85710c81e93893302b7f462a6a.scope: Deactivated successfully.
Dec 01 09:40:41 compute-0 sudo[266773]: pam_unix(sudo:session): session closed for user root
Dec 01 09:40:41 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v905: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:41 compute-0 sudo[266942]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:40:41 compute-0 sudo[266942]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:40:41 compute-0 sudo[266942]: pam_unix(sudo:session): session closed for user root
Dec 01 09:40:41 compute-0 sudo[266967]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:40:41 compute-0 sudo[266967]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:40:41 compute-0 sudo[266967]: pam_unix(sudo:session): session closed for user root
Dec 01 09:40:41 compute-0 sudo[266992]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:40:41 compute-0 sudo[266992]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:40:41 compute-0 sudo[266992]: pam_unix(sudo:session): session closed for user root
Dec 01 09:40:42 compute-0 sudo[267017]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- raw list --format json
Dec 01 09:40:42 compute-0 sudo[267017]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:40:42 compute-0 podman[267083]: 2025-12-01 09:40:42.457486768 +0000 UTC m=+0.039711324 container create 81cc0f6edd9bb2aa7e88279a770371ec7afad03fb3af29abb8e66be76e734c34 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eager_goldstine, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=reef)
Dec 01 09:40:42 compute-0 systemd[1]: Started libpod-conmon-81cc0f6edd9bb2aa7e88279a770371ec7afad03fb3af29abb8e66be76e734c34.scope.
Dec 01 09:40:42 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:40:42 compute-0 podman[267083]: 2025-12-01 09:40:42.441075866 +0000 UTC m=+0.023300442 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:40:42 compute-0 podman[267083]: 2025-12-01 09:40:42.55243928 +0000 UTC m=+0.134663916 container init 81cc0f6edd9bb2aa7e88279a770371ec7afad03fb3af29abb8e66be76e734c34 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eager_goldstine, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:40:42 compute-0 podman[267083]: 2025-12-01 09:40:42.55973287 +0000 UTC m=+0.141957466 container start 81cc0f6edd9bb2aa7e88279a770371ec7afad03fb3af29abb8e66be76e734c34 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eager_goldstine, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 01 09:40:42 compute-0 podman[267083]: 2025-12-01 09:40:42.563683774 +0000 UTC m=+0.145908370 container attach 81cc0f6edd9bb2aa7e88279a770371ec7afad03fb3af29abb8e66be76e734c34 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eager_goldstine, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:40:42 compute-0 eager_goldstine[267099]: 167 167
Dec 01 09:40:42 compute-0 systemd[1]: libpod-81cc0f6edd9bb2aa7e88279a770371ec7afad03fb3af29abb8e66be76e734c34.scope: Deactivated successfully.
Dec 01 09:40:42 compute-0 podman[267083]: 2025-12-01 09:40:42.570709436 +0000 UTC m=+0.152934032 container died 81cc0f6edd9bb2aa7e88279a770371ec7afad03fb3af29abb8e66be76e734c34 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eager_goldstine, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:40:42 compute-0 systemd[1]: var-lib-containers-storage-overlay-ee27d351a46c607ce9b06001e785e74c886ad378fa0a12133af0c5ba3b426caf-merged.mount: Deactivated successfully.
Dec 01 09:40:42 compute-0 podman[267083]: 2025-12-01 09:40:42.622574248 +0000 UTC m=+0.204798804 container remove 81cc0f6edd9bb2aa7e88279a770371ec7afad03fb3af29abb8e66be76e734c34 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eager_goldstine, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 01 09:40:42 compute-0 systemd[1]: libpod-conmon-81cc0f6edd9bb2aa7e88279a770371ec7afad03fb3af29abb8e66be76e734c34.scope: Deactivated successfully.
Dec 01 09:40:42 compute-0 ceph-mon[75031]: pgmap v905: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:42 compute-0 podman[267125]: 2025-12-01 09:40:42.869175064 +0000 UTC m=+0.076105871 container create 796ff10924c5d21282cc9967b77ddb99b2191ab564122916fa7df4b5423a0281 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_nightingale, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:40:42 compute-0 systemd[1]: Started libpod-conmon-796ff10924c5d21282cc9967b77ddb99b2191ab564122916fa7df4b5423a0281.scope.
Dec 01 09:40:42 compute-0 podman[267125]: 2025-12-01 09:40:42.83776037 +0000 UTC m=+0.044691217 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:40:42 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:40:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/daf0f1f5decfbbeb214ce7ab8b50d3f6215c631eafc25c737fc0e03613d951bc/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:40:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/daf0f1f5decfbbeb214ce7ab8b50d3f6215c631eafc25c737fc0e03613d951bc/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:40:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/daf0f1f5decfbbeb214ce7ab8b50d3f6215c631eafc25c737fc0e03613d951bc/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:40:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/daf0f1f5decfbbeb214ce7ab8b50d3f6215c631eafc25c737fc0e03613d951bc/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:40:42 compute-0 podman[267125]: 2025-12-01 09:40:42.970154769 +0000 UTC m=+0.177085576 container init 796ff10924c5d21282cc9967b77ddb99b2191ab564122916fa7df4b5423a0281 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_nightingale, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:40:42 compute-0 podman[267125]: 2025-12-01 09:40:42.983943866 +0000 UTC m=+0.190874673 container start 796ff10924c5d21282cc9967b77ddb99b2191ab564122916fa7df4b5423a0281 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_nightingale, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Dec 01 09:40:42 compute-0 podman[267125]: 2025-12-01 09:40:42.988547578 +0000 UTC m=+0.195478365 container attach 796ff10924c5d21282cc9967b77ddb99b2191ab564122916fa7df4b5423a0281 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_nightingale, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:40:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:40:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:40:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:40:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:40:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:40:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:40:43 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v906: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:43 compute-0 nice_nightingale[267141]: {
Dec 01 09:40:43 compute-0 nice_nightingale[267141]:     "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec 01 09:40:43 compute-0 nice_nightingale[267141]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:40:43 compute-0 nice_nightingale[267141]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 01 09:40:43 compute-0 nice_nightingale[267141]:         "osd_id": 0,
Dec 01 09:40:43 compute-0 nice_nightingale[267141]:         "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:40:43 compute-0 nice_nightingale[267141]:         "type": "bluestore"
Dec 01 09:40:43 compute-0 nice_nightingale[267141]:     },
Dec 01 09:40:43 compute-0 nice_nightingale[267141]:     "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec 01 09:40:43 compute-0 nice_nightingale[267141]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:40:43 compute-0 nice_nightingale[267141]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 01 09:40:43 compute-0 nice_nightingale[267141]:         "osd_id": 1,
Dec 01 09:40:43 compute-0 nice_nightingale[267141]:         "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:40:43 compute-0 nice_nightingale[267141]:         "type": "bluestore"
Dec 01 09:40:43 compute-0 nice_nightingale[267141]:     },
Dec 01 09:40:43 compute-0 nice_nightingale[267141]:     "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec 01 09:40:43 compute-0 nice_nightingale[267141]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:40:43 compute-0 nice_nightingale[267141]:         "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec 01 09:40:43 compute-0 nice_nightingale[267141]:         "osd_id": 2,
Dec 01 09:40:43 compute-0 nice_nightingale[267141]:         "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:40:43 compute-0 nice_nightingale[267141]:         "type": "bluestore"
Dec 01 09:40:43 compute-0 nice_nightingale[267141]:     }
Dec 01 09:40:43 compute-0 nice_nightingale[267141]: }
Dec 01 09:40:44 compute-0 systemd[1]: libpod-796ff10924c5d21282cc9967b77ddb99b2191ab564122916fa7df4b5423a0281.scope: Deactivated successfully.
Dec 01 09:40:44 compute-0 systemd[1]: libpod-796ff10924c5d21282cc9967b77ddb99b2191ab564122916fa7df4b5423a0281.scope: Consumed 1.045s CPU time.
Dec 01 09:40:44 compute-0 podman[267125]: 2025-12-01 09:40:44.020199385 +0000 UTC m=+1.227130162 container died 796ff10924c5d21282cc9967b77ddb99b2191ab564122916fa7df4b5423a0281 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_nightingale, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:40:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-daf0f1f5decfbbeb214ce7ab8b50d3f6215c631eafc25c737fc0e03613d951bc-merged.mount: Deactivated successfully.
Dec 01 09:40:44 compute-0 podman[267125]: 2025-12-01 09:40:44.092594678 +0000 UTC m=+1.299525455 container remove 796ff10924c5d21282cc9967b77ddb99b2191ab564122916fa7df4b5423a0281 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_nightingale, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:40:44 compute-0 systemd[1]: libpod-conmon-796ff10924c5d21282cc9967b77ddb99b2191ab564122916fa7df4b5423a0281.scope: Deactivated successfully.
Dec 01 09:40:44 compute-0 sudo[267017]: pam_unix(sudo:session): session closed for user root
Dec 01 09:40:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:40:44 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:40:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:40:44 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:40:44 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 3722aac0-65c5-468d-9489-b6a3b3ff1fb7 does not exist
Dec 01 09:40:44 compute-0 sudo[267188]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:40:44 compute-0 sudo[267188]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:40:44 compute-0 sudo[267188]: pam_unix(sudo:session): session closed for user root
Dec 01 09:40:44 compute-0 sudo[267213]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:40:44 compute-0 sudo[267213]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:40:44 compute-0 sudo[267213]: pam_unix(sudo:session): session closed for user root
Dec 01 09:40:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Dec 01 09:40:44 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2192770870' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:40:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Dec 01 09:40:44 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2192770870' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:40:45 compute-0 ceph-mon[75031]: pgmap v906: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:45 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:40:45 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:40:45 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/2192770870' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:40:45 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/2192770870' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:40:45 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:40:45 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v907: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:47 compute-0 ceph-mon[75031]: pgmap v907: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:47 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v908: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:49 compute-0 ceph-mon[75031]: pgmap v908: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:49 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v909: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:50 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:40:51 compute-0 ceph-mon[75031]: pgmap v909: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:51 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v910: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:53 compute-0 ceph-mon[75031]: pgmap v910: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:53 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v911: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:55 compute-0 ceph-mon[75031]: pgmap v911: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:55 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:40:55 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v912: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:56 compute-0 podman[267238]: 2025-12-01 09:40:56.988974838 +0000 UTC m=+0.089978866 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 01 09:40:57 compute-0 ceph-mon[75031]: pgmap v912: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:57 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v913: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:59 compute-0 ceph-mon[75031]: pgmap v913: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:40:59 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v914: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:00 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:41:01 compute-0 ceph-mon[75031]: pgmap v914: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:01 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v915: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:03 compute-0 ceph-mon[75031]: pgmap v915: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:03 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v916: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:05 compute-0 ceph-mon[75031]: pgmap v916: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:05 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:41:05 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v917: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:07 compute-0 podman[267259]: 2025-12-01 09:41:07.03937928 +0000 UTC m=+0.136887614 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 01 09:41:07 compute-0 ceph-mon[75031]: pgmap v917: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:07 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v918: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:09 compute-0 ceph-mon[75031]: pgmap v918: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:09 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v919: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:10 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:41:10 compute-0 podman[267285]: 2025-12-01 09:41:10.951926828 +0000 UTC m=+0.059453170 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 01 09:41:11 compute-0 ceph-mon[75031]: pgmap v919: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:11 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v920: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:41:13
Dec 01 09:41:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:41:13 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:41:13 compute-0 ceph-mgr[75324]: [balancer INFO root] pools ['backups', 'images', 'volumes', '.mgr', 'vms', 'cephfs.cephfs.data', 'cephfs.cephfs.meta']
Dec 01 09:41:13 compute-0 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec 01 09:41:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:41:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:41:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:41:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:41:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:41:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:41:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:41:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:41:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:41:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:41:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:41:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:41:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:41:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:41:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:41:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:41:13 compute-0 ceph-mon[75031]: pgmap v920: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:13 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v921: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:15 compute-0 ceph-mon[75031]: pgmap v921: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:15 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:41:15 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v922: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:17 compute-0 ceph-mon[75031]: pgmap v922: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:17 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v923: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:41:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:41:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:41:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:41:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 6.359070782053786e-08 of space, bias 1.0, pg target 1.907721234616136e-05 quantized to 32 (current 32)
Dec 01 09:41:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:41:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 6.359070782053786e-08 of space, bias 1.0, pg target 1.907721234616136e-05 quantized to 32 (current 32)
Dec 01 09:41:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:41:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:41:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:41:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0006661126644201341 of space, bias 1.0, pg target 0.19983379932604023 quantized to 32 (current 32)
Dec 01 09:41:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:41:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec 01 09:41:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:41:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:41:19 compute-0 ceph-mon[75031]: pgmap v923: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:19 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v924: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:41:20.480 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:41:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:41:20.481 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:41:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:41:20.481 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:41:20 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:41:21 compute-0 ceph-mon[75031]: pgmap v924: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:21 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v925: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:23 compute-0 ceph-mon[75031]: pgmap v925: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:23 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v926: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:25 compute-0 ceph-mon[75031]: pgmap v926: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:41:25 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v927: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:27 compute-0 ceph-mon[75031]: pgmap v927: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:27 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v928: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:28 compute-0 podman[267304]: 2025-12-01 09:41:28.001246659 +0000 UTC m=+0.101566039 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3)
Dec 01 09:41:28 compute-0 ceph-mon[75031]: pgmap v928: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:29 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v929: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:30 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:41:30 compute-0 ceph-mon[75031]: pgmap v929: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:31 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v930: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:32 compute-0 ceph-mon[75031]: pgmap v930: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:33 compute-0 nova_compute[250706]: 2025-12-01 09:41:33.053 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:41:33 compute-0 nova_compute[250706]: 2025-12-01 09:41:33.054 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:41:33 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v931: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:34 compute-0 nova_compute[250706]: 2025-12-01 09:41:34.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:41:34 compute-0 nova_compute[250706]: 2025-12-01 09:41:34.053 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 09:41:34 compute-0 nova_compute[250706]: 2025-12-01 09:41:34.053 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 09:41:34 compute-0 nova_compute[250706]: 2025-12-01 09:41:34.082 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 09:41:34 compute-0 nova_compute[250706]: 2025-12-01 09:41:34.082 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:41:34 compute-0 ceph-mon[75031]: pgmap v931: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:35 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:41:35 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v932: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:36 compute-0 nova_compute[250706]: 2025-12-01 09:41:36.051 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:41:36 compute-0 nova_compute[250706]: 2025-12-01 09:41:36.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:41:36 compute-0 ceph-mon[75031]: pgmap v932: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:37 compute-0 nova_compute[250706]: 2025-12-01 09:41:37.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:41:37 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v933: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:38 compute-0 podman[267325]: 2025-12-01 09:41:38.003259311 +0000 UTC m=+0.109283552 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Dec 01 09:41:38 compute-0 nova_compute[250706]: 2025-12-01 09:41:38.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:41:38 compute-0 nova_compute[250706]: 2025-12-01 09:41:38.053 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 09:41:38 compute-0 nova_compute[250706]: 2025-12-01 09:41:38.053 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:41:38 compute-0 nova_compute[250706]: 2025-12-01 09:41:38.091 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:41:38 compute-0 nova_compute[250706]: 2025-12-01 09:41:38.092 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:41:38 compute-0 nova_compute[250706]: 2025-12-01 09:41:38.092 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:41:38 compute-0 nova_compute[250706]: 2025-12-01 09:41:38.093 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 09:41:38 compute-0 nova_compute[250706]: 2025-12-01 09:41:38.094 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:41:38 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:41:38 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1443264099' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:41:38 compute-0 nova_compute[250706]: 2025-12-01 09:41:38.574 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:41:38 compute-0 nova_compute[250706]: 2025-12-01 09:41:38.799 250710 WARNING nova.virt.libvirt.driver [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 09:41:38 compute-0 nova_compute[250706]: 2025-12-01 09:41:38.800 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5164MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 09:41:38 compute-0 nova_compute[250706]: 2025-12-01 09:41:38.801 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:41:38 compute-0 nova_compute[250706]: 2025-12-01 09:41:38.801 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:41:38 compute-0 ceph-mon[75031]: pgmap v933: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:38 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1443264099' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:41:38 compute-0 nova_compute[250706]: 2025-12-01 09:41:38.894 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 09:41:38 compute-0 nova_compute[250706]: 2025-12-01 09:41:38.895 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 09:41:38 compute-0 nova_compute[250706]: 2025-12-01 09:41:38.922 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:41:39 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:41:39 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2519439549' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:41:39 compute-0 nova_compute[250706]: 2025-12-01 09:41:39.363 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:41:39 compute-0 nova_compute[250706]: 2025-12-01 09:41:39.370 250710 DEBUG nova.compute.provider_tree [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed in ProviderTree for provider: 847e3dbe-0f76-4032-a374-8c965945c22f update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 09:41:39 compute-0 nova_compute[250706]: 2025-12-01 09:41:39.395 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed for provider 847e3dbe-0f76-4032-a374-8c965945c22f based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 09:41:39 compute-0 nova_compute[250706]: 2025-12-01 09:41:39.399 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 09:41:39 compute-0 nova_compute[250706]: 2025-12-01 09:41:39.400 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.599s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:41:39 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v934: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:39 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2519439549' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:41:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:41:40 compute-0 ceph-mon[75031]: pgmap v934: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:41 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v935: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:41 compute-0 podman[267396]: 2025-12-01 09:41:41.948328571 +0000 UTC m=+0.056821013 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 01 09:41:42 compute-0 ceph-mon[75031]: pgmap v935: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:41:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:41:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:41:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:41:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:41:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:41:43 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v936: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:44 compute-0 sudo[267415]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:41:44 compute-0 sudo[267415]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:41:44 compute-0 sudo[267415]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Dec 01 09:41:44 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/769040397' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:41:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Dec 01 09:41:44 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/769040397' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:41:44 compute-0 sudo[267440]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:41:44 compute-0 sudo[267440]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:41:44 compute-0 sudo[267440]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:44 compute-0 sudo[267465]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:41:44 compute-0 sudo[267465]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:41:44 compute-0 sudo[267465]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:44 compute-0 sudo[267490]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Dec 01 09:41:44 compute-0 sudo[267490]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:41:44 compute-0 ceph-mon[75031]: pgmap v936: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:44 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/769040397' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:41:44 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/769040397' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:41:45 compute-0 sudo[267490]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:45 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:41:45 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:41:45 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec 01 09:41:45 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:41:45 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec 01 09:41:45 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:41:45 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 1b938328-a62c-48f9-a207-dfb71916371c does not exist
Dec 01 09:41:45 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 0a2679dc-95dc-4e89-8064-8c2bffdc981d does not exist
Dec 01 09:41:45 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev d4126b1b-1f31-4fdc-b3d5-bb0f2dbba022 does not exist
Dec 01 09:41:45 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec 01 09:41:45 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:41:45 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec 01 09:41:45 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:41:45 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:41:45 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:41:45 compute-0 sudo[267547]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:41:45 compute-0 sudo[267547]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:41:45 compute-0 sudo[267547]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:45 compute-0 sudo[267572]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:41:45 compute-0 sudo[267572]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:41:45 compute-0 sudo[267572]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:45 compute-0 sudo[267597]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:41:45 compute-0 sudo[267597]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:41:45 compute-0 sudo[267597]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:45 compute-0 sudo[267622]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --yes --no-systemd
Dec 01 09:41:45 compute-0 sudo[267622]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:41:45 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:41:45 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v937: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:45 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:41:45 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:41:45 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:41:45 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:41:45 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:41:45 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:41:46 compute-0 podman[267685]: 2025-12-01 09:41:46.040081638 +0000 UTC m=+0.051387628 container create 1f44831039a5c9ed6ec1a2f8a011ff6d67672423a1c76702be6524e630f7c482 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_euclid, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec 01 09:41:46 compute-0 systemd[1]: Started libpod-conmon-1f44831039a5c9ed6ec1a2f8a011ff6d67672423a1c76702be6524e630f7c482.scope.
Dec 01 09:41:46 compute-0 podman[267685]: 2025-12-01 09:41:46.013643068 +0000 UTC m=+0.024949138 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:41:46 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:41:46 compute-0 podman[267685]: 2025-12-01 09:41:46.138710862 +0000 UTC m=+0.150016932 container init 1f44831039a5c9ed6ec1a2f8a011ff6d67672423a1c76702be6524e630f7c482 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_euclid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:41:46 compute-0 podman[267685]: 2025-12-01 09:41:46.14873087 +0000 UTC m=+0.160036870 container start 1f44831039a5c9ed6ec1a2f8a011ff6d67672423a1c76702be6524e630f7c482 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_euclid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Dec 01 09:41:46 compute-0 podman[267685]: 2025-12-01 09:41:46.153189048 +0000 UTC m=+0.164495168 container attach 1f44831039a5c9ed6ec1a2f8a011ff6d67672423a1c76702be6524e630f7c482 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_euclid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec 01 09:41:46 compute-0 affectionate_euclid[267701]: 167 167
Dec 01 09:41:46 compute-0 systemd[1]: libpod-1f44831039a5c9ed6ec1a2f8a011ff6d67672423a1c76702be6524e630f7c482.scope: Deactivated successfully.
Dec 01 09:41:46 compute-0 conmon[267701]: conmon 1f44831039a5c9ed6ec1 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-1f44831039a5c9ed6ec1a2f8a011ff6d67672423a1c76702be6524e630f7c482.scope/container/memory.events
Dec 01 09:41:46 compute-0 podman[267685]: 2025-12-01 09:41:46.157797901 +0000 UTC m=+0.169103911 container died 1f44831039a5c9ed6ec1a2f8a011ff6d67672423a1c76702be6524e630f7c482 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_euclid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:41:46 compute-0 systemd[1]: var-lib-containers-storage-overlay-e7e87d57e9c0d27964eda6e9b889e272bc7be47f752a6a4a723bb76ff5fc7e7c-merged.mount: Deactivated successfully.
Dec 01 09:41:46 compute-0 podman[267685]: 2025-12-01 09:41:46.204625416 +0000 UTC m=+0.215931406 container remove 1f44831039a5c9ed6ec1a2f8a011ff6d67672423a1c76702be6524e630f7c482 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_euclid, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 01 09:41:46 compute-0 systemd[1]: libpod-conmon-1f44831039a5c9ed6ec1a2f8a011ff6d67672423a1c76702be6524e630f7c482.scope: Deactivated successfully.
Dec 01 09:41:46 compute-0 podman[267726]: 2025-12-01 09:41:46.457319978 +0000 UTC m=+0.058098480 container create 51f37142de03025673be3fa61cadc63455b47d47b6299be52b8ef5aa7fdc8f34 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=xenodochial_goodall, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3)
Dec 01 09:41:46 compute-0 podman[267726]: 2025-12-01 09:41:46.433159344 +0000 UTC m=+0.033937886 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:41:46 compute-0 systemd[1]: Started libpod-conmon-51f37142de03025673be3fa61cadc63455b47d47b6299be52b8ef5aa7fdc8f34.scope.
Dec 01 09:41:46 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:41:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45350c9f0b5cbc6c9c52882a9acae752461acf06351b18964ef6b11cb04af717/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:41:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45350c9f0b5cbc6c9c52882a9acae752461acf06351b18964ef6b11cb04af717/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:41:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45350c9f0b5cbc6c9c52882a9acae752461acf06351b18964ef6b11cb04af717/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:41:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45350c9f0b5cbc6c9c52882a9acae752461acf06351b18964ef6b11cb04af717/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:41:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45350c9f0b5cbc6c9c52882a9acae752461acf06351b18964ef6b11cb04af717/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:41:46 compute-0 podman[267726]: 2025-12-01 09:41:46.661428134 +0000 UTC m=+0.262206656 container init 51f37142de03025673be3fa61cadc63455b47d47b6299be52b8ef5aa7fdc8f34 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=xenodochial_goodall, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_REF=reef)
Dec 01 09:41:46 compute-0 podman[267726]: 2025-12-01 09:41:46.672451711 +0000 UTC m=+0.273230203 container start 51f37142de03025673be3fa61cadc63455b47d47b6299be52b8ef5aa7fdc8f34 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=xenodochial_goodall, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:41:46 compute-0 podman[267726]: 2025-12-01 09:41:46.816095559 +0000 UTC m=+0.416874051 container attach 51f37142de03025673be3fa61cadc63455b47d47b6299be52b8ef5aa7fdc8f34 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=xenodochial_goodall, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Dec 01 09:41:47 compute-0 ceph-mon[75031]: pgmap v937: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:47 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v938: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:47 compute-0 xenodochial_goodall[267743]: --> passed data devices: 0 physical, 3 LVM
Dec 01 09:41:47 compute-0 xenodochial_goodall[267743]: --> relative data size: 1.0
Dec 01 09:41:47 compute-0 xenodochial_goodall[267743]: --> All data devices are unavailable
Dec 01 09:41:47 compute-0 systemd[1]: libpod-51f37142de03025673be3fa61cadc63455b47d47b6299be52b8ef5aa7fdc8f34.scope: Deactivated successfully.
Dec 01 09:41:47 compute-0 podman[267726]: 2025-12-01 09:41:47.923414642 +0000 UTC m=+1.524193174 container died 51f37142de03025673be3fa61cadc63455b47d47b6299be52b8ef5aa7fdc8f34 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=xenodochial_goodall, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Dec 01 09:41:47 compute-0 systemd[1]: libpod-51f37142de03025673be3fa61cadc63455b47d47b6299be52b8ef5aa7fdc8f34.scope: Consumed 1.116s CPU time.
Dec 01 09:41:48 compute-0 systemd[1]: var-lib-containers-storage-overlay-45350c9f0b5cbc6c9c52882a9acae752461acf06351b18964ef6b11cb04af717-merged.mount: Deactivated successfully.
Dec 01 09:41:48 compute-0 podman[267726]: 2025-12-01 09:41:48.072612638 +0000 UTC m=+1.673391130 container remove 51f37142de03025673be3fa61cadc63455b47d47b6299be52b8ef5aa7fdc8f34 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=xenodochial_goodall, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef)
Dec 01 09:41:48 compute-0 systemd[1]: libpod-conmon-51f37142de03025673be3fa61cadc63455b47d47b6299be52b8ef5aa7fdc8f34.scope: Deactivated successfully.
Dec 01 09:41:48 compute-0 sudo[267622]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:48 compute-0 sudo[267785]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:41:48 compute-0 sudo[267785]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:41:48 compute-0 sudo[267785]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:48 compute-0 sudo[267810]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:41:48 compute-0 sudo[267810]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:41:48 compute-0 sudo[267810]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:48 compute-0 sudo[267835]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:41:48 compute-0 sudo[267835]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:41:48 compute-0 sudo[267835]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:48 compute-0 sudo[267860]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- lvm list --format json
Dec 01 09:41:48 compute-0 sudo[267860]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:41:48 compute-0 podman[267925]: 2025-12-01 09:41:48.835446719 +0000 UTC m=+0.061615952 container create 5a446200b6ba360fc01e490cea7b38c269bc392c05cb9cf81886f912fe8caf04 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_agnesi, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Dec 01 09:41:48 compute-0 systemd[1]: Started libpod-conmon-5a446200b6ba360fc01e490cea7b38c269bc392c05cb9cf81886f912fe8caf04.scope.
Dec 01 09:41:48 compute-0 podman[267925]: 2025-12-01 09:41:48.812711736 +0000 UTC m=+0.038881009 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:41:48 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:41:48 compute-0 podman[267925]: 2025-12-01 09:41:48.94267591 +0000 UTC m=+0.168845123 container init 5a446200b6ba360fc01e490cea7b38c269bc392c05cb9cf81886f912fe8caf04 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_agnesi, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:41:48 compute-0 podman[267925]: 2025-12-01 09:41:48.955067106 +0000 UTC m=+0.181236299 container start 5a446200b6ba360fc01e490cea7b38c269bc392c05cb9cf81886f912fe8caf04 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_agnesi, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Dec 01 09:41:48 compute-0 podman[267925]: 2025-12-01 09:41:48.959516114 +0000 UTC m=+0.185685327 container attach 5a446200b6ba360fc01e490cea7b38c269bc392c05cb9cf81886f912fe8caf04 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_agnesi, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS)
Dec 01 09:41:48 compute-0 great_agnesi[267941]: 167 167
Dec 01 09:41:48 compute-0 systemd[1]: libpod-5a446200b6ba360fc01e490cea7b38c269bc392c05cb9cf81886f912fe8caf04.scope: Deactivated successfully.
Dec 01 09:41:48 compute-0 conmon[267941]: conmon 5a446200b6ba360fc01e <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-5a446200b6ba360fc01e490cea7b38c269bc392c05cb9cf81886f912fe8caf04.scope/container/memory.events
Dec 01 09:41:48 compute-0 podman[267925]: 2025-12-01 09:41:48.964639532 +0000 UTC m=+0.190808715 container died 5a446200b6ba360fc01e490cea7b38c269bc392c05cb9cf81886f912fe8caf04 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_agnesi, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:41:48 compute-0 systemd[1]: var-lib-containers-storage-overlay-aff9fbdd6598b1215bd04bc8c457e7e6de7ad0dedd0b340124bc0756fd3c35c6-merged.mount: Deactivated successfully.
Dec 01 09:41:48 compute-0 podman[267925]: 2025-12-01 09:41:48.999709819 +0000 UTC m=+0.225879022 container remove 5a446200b6ba360fc01e490cea7b38c269bc392c05cb9cf81886f912fe8caf04 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_agnesi, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:41:49 compute-0 systemd[1]: libpod-conmon-5a446200b6ba360fc01e490cea7b38c269bc392c05cb9cf81886f912fe8caf04.scope: Deactivated successfully.
Dec 01 09:41:49 compute-0 podman[267966]: 2025-12-01 09:41:49.204360221 +0000 UTC m=+0.050092151 container create 2d63cabf0ac2d551c1131017192dc84a091860d4b963ec02ff9500cefa00c3e4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_murdock, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:41:49 compute-0 systemd[1]: Started libpod-conmon-2d63cabf0ac2d551c1131017192dc84a091860d4b963ec02ff9500cefa00c3e4.scope.
Dec 01 09:41:49 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:41:49 compute-0 podman[267966]: 2025-12-01 09:41:49.178321022 +0000 UTC m=+0.024052962 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:41:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5dc3245e95f6a217fba6bb9df0ea12aff2c945a6760f8741e7d77d3d6d83a478/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:41:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5dc3245e95f6a217fba6bb9df0ea12aff2c945a6760f8741e7d77d3d6d83a478/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:41:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5dc3245e95f6a217fba6bb9df0ea12aff2c945a6760f8741e7d77d3d6d83a478/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:41:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5dc3245e95f6a217fba6bb9df0ea12aff2c945a6760f8741e7d77d3d6d83a478/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:41:49 compute-0 podman[267966]: 2025-12-01 09:41:49.298538017 +0000 UTC m=+0.144270007 container init 2d63cabf0ac2d551c1131017192dc84a091860d4b963ec02ff9500cefa00c3e4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_murdock, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:41:49 compute-0 podman[267966]: 2025-12-01 09:41:49.309736899 +0000 UTC m=+0.155468789 container start 2d63cabf0ac2d551c1131017192dc84a091860d4b963ec02ff9500cefa00c3e4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_murdock, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:41:49 compute-0 podman[267966]: 2025-12-01 09:41:49.313145617 +0000 UTC m=+0.158877557 container attach 2d63cabf0ac2d551c1131017192dc84a091860d4b963ec02ff9500cefa00c3e4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_murdock, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3)
Dec 01 09:41:49 compute-0 ceph-mon[75031]: pgmap v938: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:49 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v939: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]: {
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:     "0": [
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:         {
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             "devices": [
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "/dev/loop3"
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             ],
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             "lv_name": "ceph_lv0",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             "lv_size": "21470642176",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             "name": "ceph_lv0",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             "tags": {
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.cluster_name": "ceph",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.crush_device_class": "",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.encrypted": "0",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.osd_id": "0",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.type": "block",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.vdo": "0"
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             },
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             "type": "block",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             "vg_name": "ceph_vg0"
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:         }
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:     ],
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:     "1": [
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:         {
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             "devices": [
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "/dev/loop4"
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             ],
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             "lv_name": "ceph_lv1",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             "lv_size": "21470642176",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             "name": "ceph_lv1",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             "tags": {
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.cluster_name": "ceph",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.crush_device_class": "",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.encrypted": "0",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.osd_id": "1",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.type": "block",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.vdo": "0"
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             },
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             "type": "block",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             "vg_name": "ceph_vg1"
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:         }
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:     ],
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:     "2": [
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:         {
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             "devices": [
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "/dev/loop5"
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             ],
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             "lv_name": "ceph_lv2",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             "lv_size": "21470642176",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             "name": "ceph_lv2",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             "tags": {
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.cluster_name": "ceph",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.crush_device_class": "",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.encrypted": "0",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.osd_id": "2",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.type": "block",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:                 "ceph.vdo": "0"
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             },
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             "type": "block",
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:             "vg_name": "ceph_vg2"
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:         }
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]:     ]
Dec 01 09:41:50 compute-0 relaxed_murdock[267983]: }
Dec 01 09:41:50 compute-0 systemd[1]: libpod-2d63cabf0ac2d551c1131017192dc84a091860d4b963ec02ff9500cefa00c3e4.scope: Deactivated successfully.
Dec 01 09:41:50 compute-0 podman[267966]: 2025-12-01 09:41:50.177660851 +0000 UTC m=+1.023392741 container died 2d63cabf0ac2d551c1131017192dc84a091860d4b963ec02ff9500cefa00c3e4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_murdock, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default)
Dec 01 09:41:50 compute-0 systemd[1]: var-lib-containers-storage-overlay-5dc3245e95f6a217fba6bb9df0ea12aff2c945a6760f8741e7d77d3d6d83a478-merged.mount: Deactivated successfully.
Dec 01 09:41:50 compute-0 podman[267966]: 2025-12-01 09:41:50.241931808 +0000 UTC m=+1.087663698 container remove 2d63cabf0ac2d551c1131017192dc84a091860d4b963ec02ff9500cefa00c3e4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_murdock, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:41:50 compute-0 systemd[1]: libpod-conmon-2d63cabf0ac2d551c1131017192dc84a091860d4b963ec02ff9500cefa00c3e4.scope: Deactivated successfully.
Dec 01 09:41:50 compute-0 sudo[267860]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:50 compute-0 sudo[268002]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:41:50 compute-0 sudo[268002]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:41:50 compute-0 sudo[268002]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:50 compute-0 sudo[268027]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:41:50 compute-0 sudo[268027]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:41:50 compute-0 sudo[268027]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:50 compute-0 sudo[268052]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:41:50 compute-0 sudo[268052]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:41:50 compute-0 sudo[268052]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:50 compute-0 sudo[268077]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- raw list --format json
Dec 01 09:41:50 compute-0 sudo[268077]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:41:50 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:41:50 compute-0 podman[268142]: 2025-12-01 09:41:50.809980542 +0000 UTC m=+0.040025401 container create 0c2133b94933b9f30add616838e635bdae605ab21e3f0afd2db14b5225a554f3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=competent_shannon, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:41:50 compute-0 systemd[1]: Started libpod-conmon-0c2133b94933b9f30add616838e635bdae605ab21e3f0afd2db14b5225a554f3.scope.
Dec 01 09:41:50 compute-0 podman[268142]: 2025-12-01 09:41:50.794810916 +0000 UTC m=+0.024855775 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:41:50 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:41:50 compute-0 podman[268142]: 2025-12-01 09:41:50.911170949 +0000 UTC m=+0.141215828 container init 0c2133b94933b9f30add616838e635bdae605ab21e3f0afd2db14b5225a554f3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=competent_shannon, CEPH_REF=reef, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Dec 01 09:41:50 compute-0 podman[268142]: 2025-12-01 09:41:50.918124009 +0000 UTC m=+0.148168858 container start 0c2133b94933b9f30add616838e635bdae605ab21e3f0afd2db14b5225a554f3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=competent_shannon, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:41:50 compute-0 podman[268142]: 2025-12-01 09:41:50.921063043 +0000 UTC m=+0.151107992 container attach 0c2133b94933b9f30add616838e635bdae605ab21e3f0afd2db14b5225a554f3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=competent_shannon, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:41:50 compute-0 competent_shannon[268158]: 167 167
Dec 01 09:41:50 compute-0 systemd[1]: libpod-0c2133b94933b9f30add616838e635bdae605ab21e3f0afd2db14b5225a554f3.scope: Deactivated successfully.
Dec 01 09:41:50 compute-0 podman[268142]: 2025-12-01 09:41:50.926793948 +0000 UTC m=+0.156838807 container died 0c2133b94933b9f30add616838e635bdae605ab21e3f0afd2db14b5225a554f3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=competent_shannon, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef)
Dec 01 09:41:50 compute-0 systemd[1]: var-lib-containers-storage-overlay-95d180b650abd6518064b0f7b2dc472f4f63bff14d0d1758f8aba197f0a8769a-merged.mount: Deactivated successfully.
Dec 01 09:41:50 compute-0 podman[268142]: 2025-12-01 09:41:50.962456903 +0000 UTC m=+0.192501762 container remove 0c2133b94933b9f30add616838e635bdae605ab21e3f0afd2db14b5225a554f3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=competent_shannon, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 01 09:41:50 compute-0 systemd[1]: libpod-conmon-0c2133b94933b9f30add616838e635bdae605ab21e3f0afd2db14b5225a554f3.scope: Deactivated successfully.
Dec 01 09:41:51 compute-0 podman[268181]: 2025-12-01 09:41:51.145993868 +0000 UTC m=+0.044903992 container create a49ae1918151cc46a346162a85f9b7da315c3f2c03df9e76729132396e614104 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_taussig, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:41:51 compute-0 systemd[1]: Started libpod-conmon-a49ae1918151cc46a346162a85f9b7da315c3f2c03df9e76729132396e614104.scope.
Dec 01 09:41:51 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:41:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2bfdca299e9739a1f0907eef33da340c6f5ce7fa6a9f23daba5d9d0f581a201e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:41:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2bfdca299e9739a1f0907eef33da340c6f5ce7fa6a9f23daba5d9d0f581a201e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:41:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2bfdca299e9739a1f0907eef33da340c6f5ce7fa6a9f23daba5d9d0f581a201e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:41:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2bfdca299e9739a1f0907eef33da340c6f5ce7fa6a9f23daba5d9d0f581a201e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:41:51 compute-0 podman[268181]: 2025-12-01 09:41:51.125130778 +0000 UTC m=+0.024040932 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:41:51 compute-0 podman[268181]: 2025-12-01 09:41:51.222805795 +0000 UTC m=+0.121715959 container init a49ae1918151cc46a346162a85f9b7da315c3f2c03df9e76729132396e614104 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_taussig, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:41:51 compute-0 podman[268181]: 2025-12-01 09:41:51.230885577 +0000 UTC m=+0.129795711 container start a49ae1918151cc46a346162a85f9b7da315c3f2c03df9e76729132396e614104 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_taussig, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:41:51 compute-0 podman[268181]: 2025-12-01 09:41:51.234423359 +0000 UTC m=+0.133333493 container attach a49ae1918151cc46a346162a85f9b7da315c3f2c03df9e76729132396e614104 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_taussig, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:41:51 compute-0 ceph-mon[75031]: pgmap v939: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:51 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v940: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:52 compute-0 flamboyant_taussig[268198]: {
Dec 01 09:41:52 compute-0 flamboyant_taussig[268198]:     "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec 01 09:41:52 compute-0 flamboyant_taussig[268198]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:41:52 compute-0 flamboyant_taussig[268198]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 01 09:41:52 compute-0 flamboyant_taussig[268198]:         "osd_id": 0,
Dec 01 09:41:52 compute-0 flamboyant_taussig[268198]:         "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:41:52 compute-0 flamboyant_taussig[268198]:         "type": "bluestore"
Dec 01 09:41:52 compute-0 flamboyant_taussig[268198]:     },
Dec 01 09:41:52 compute-0 flamboyant_taussig[268198]:     "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec 01 09:41:52 compute-0 flamboyant_taussig[268198]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:41:52 compute-0 flamboyant_taussig[268198]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 01 09:41:52 compute-0 flamboyant_taussig[268198]:         "osd_id": 1,
Dec 01 09:41:52 compute-0 flamboyant_taussig[268198]:         "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:41:52 compute-0 flamboyant_taussig[268198]:         "type": "bluestore"
Dec 01 09:41:52 compute-0 flamboyant_taussig[268198]:     },
Dec 01 09:41:52 compute-0 flamboyant_taussig[268198]:     "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec 01 09:41:52 compute-0 flamboyant_taussig[268198]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:41:52 compute-0 flamboyant_taussig[268198]:         "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec 01 09:41:52 compute-0 flamboyant_taussig[268198]:         "osd_id": 2,
Dec 01 09:41:52 compute-0 flamboyant_taussig[268198]:         "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:41:52 compute-0 flamboyant_taussig[268198]:         "type": "bluestore"
Dec 01 09:41:52 compute-0 flamboyant_taussig[268198]:     }
Dec 01 09:41:52 compute-0 flamboyant_taussig[268198]: }
Dec 01 09:41:52 compute-0 systemd[1]: libpod-a49ae1918151cc46a346162a85f9b7da315c3f2c03df9e76729132396e614104.scope: Deactivated successfully.
Dec 01 09:41:52 compute-0 podman[268181]: 2025-12-01 09:41:52.299882118 +0000 UTC m=+1.198792292 container died a49ae1918151cc46a346162a85f9b7da315c3f2c03df9e76729132396e614104 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_taussig, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Dec 01 09:41:52 compute-0 systemd[1]: libpod-a49ae1918151cc46a346162a85f9b7da315c3f2c03df9e76729132396e614104.scope: Consumed 1.077s CPU time.
Dec 01 09:41:52 compute-0 systemd[1]: var-lib-containers-storage-overlay-2bfdca299e9739a1f0907eef33da340c6f5ce7fa6a9f23daba5d9d0f581a201e-merged.mount: Deactivated successfully.
Dec 01 09:41:52 compute-0 podman[268181]: 2025-12-01 09:41:52.367754158 +0000 UTC m=+1.266664292 container remove a49ae1918151cc46a346162a85f9b7da315c3f2c03df9e76729132396e614104 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_taussig, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:41:52 compute-0 systemd[1]: libpod-conmon-a49ae1918151cc46a346162a85f9b7da315c3f2c03df9e76729132396e614104.scope: Deactivated successfully.
Dec 01 09:41:52 compute-0 sudo[268077]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:52 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:41:52 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:41:52 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:41:52 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:41:52 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 94fe13ad-898e-4db0-9303-c72b7568057d does not exist
Dec 01 09:41:52 compute-0 sudo[268245]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:41:52 compute-0 sudo[268245]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:41:52 compute-0 sudo[268245]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:52 compute-0 sudo[268270]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:41:52 compute-0 sudo[268270]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:41:52 compute-0 sudo[268270]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:53 compute-0 ceph-mon[75031]: pgmap v940: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:53 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:41:53 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:41:53 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v941: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:55 compute-0 ceph-mon[75031]: pgmap v941: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:55 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:41:55 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v942: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:57 compute-0 ceph-mon[75031]: pgmap v942: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:57 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v943: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:59 compute-0 podman[268295]: 2025-12-01 09:41:59.012437308 +0000 UTC m=+0.097643087 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd)
Dec 01 09:41:59 compute-0 ceph-mon[75031]: pgmap v943: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:41:59 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v944: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:00 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:42:01 compute-0 ceph-mon[75031]: pgmap v944: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:01 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v945: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:03 compute-0 ceph-mon[75031]: pgmap v945: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:03 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v946: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:04 compute-0 ceph-mon[75031]: pgmap v946: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:05 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:42:05 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v947: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:06 compute-0 ceph-mon[75031]: pgmap v947: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:07 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v948: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:08 compute-0 ceph-mon[75031]: pgmap v948: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:09 compute-0 podman[268316]: 2025-12-01 09:42:09.025147407 +0000 UTC m=+0.120388731 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Dec 01 09:42:09 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v949: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:10 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:42:10 compute-0 ceph-mon[75031]: pgmap v949: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:11 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v950: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:12 compute-0 podman[268342]: 2025-12-01 09:42:12.949948846 +0000 UTC m=+0.056149625 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec 01 09:42:12 compute-0 ceph-mon[75031]: pgmap v950: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:42:13
Dec 01 09:42:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:42:13 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:42:13 compute-0 ceph-mgr[75324]: [balancer INFO root] pools ['backups', 'vms', '.mgr', 'volumes', 'cephfs.cephfs.data', 'images', 'cephfs.cephfs.meta']
Dec 01 09:42:13 compute-0 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec 01 09:42:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:42:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:42:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:42:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:42:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:42:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:42:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:42:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:42:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:42:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:42:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:42:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:42:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:42:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:42:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:42:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:42:13 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v951: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:15 compute-0 ceph-mon[75031]: pgmap v951: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:15 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:42:15 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v952: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:17 compute-0 ceph-mon[75031]: pgmap v952: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:17 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v953: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:42:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:42:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:42:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:42:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 6.359070782053786e-08 of space, bias 1.0, pg target 1.907721234616136e-05 quantized to 32 (current 32)
Dec 01 09:42:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:42:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 6.359070782053786e-08 of space, bias 1.0, pg target 1.907721234616136e-05 quantized to 32 (current 32)
Dec 01 09:42:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:42:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:42:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:42:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0006661126644201341 of space, bias 1.0, pg target 0.19983379932604023 quantized to 32 (current 32)
Dec 01 09:42:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:42:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec 01 09:42:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:42:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:42:19 compute-0 ceph-mon[75031]: pgmap v953: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:19 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v954: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:42:20.481 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:42:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:42:20.481 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:42:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:42:20.481 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:42:20 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:42:21 compute-0 ceph-mon[75031]: pgmap v954: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:21 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v955: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:22 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:42:22 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.0 total, 600.0 interval
                                           Cumulative writes: 4491 writes, 19K keys, 4491 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 4491 writes, 4491 syncs, 1.00 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1487 writes, 7032 keys, 1487 commit groups, 1.0 writes per commit group, ingest: 6.42 MB, 0.01 MB/s
                                           Interval WAL: 1487 writes, 1487 syncs, 1.00 writes per sync, written: 0.01 GB, 0.01 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     91.9      0.17              0.06        11    0.016       0      0       0.0       0.0
                                             L6      1/0    5.13 MB   0.0      0.1     0.0      0.0       0.0      0.0       0.0   3.1    147.3    121.0      0.40              0.18        10    0.040     37K   5313       0.0       0.0
                                            Sum      1/0    5.13 MB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   4.1    103.4    112.3      0.57              0.24        21    0.027     37K   5313       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   5.3    104.1    106.4      0.29              0.12        10    0.029     21K   3042       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.0      0.0       0.0   0.0    147.3    121.0      0.40              0.18        10    0.040     37K   5313       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     93.0      0.17              0.06        10    0.017       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     19.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1800.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.015, interval 0.006
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.06 GB write, 0.04 MB/s write, 0.06 GB read, 0.03 MB/s read, 0.6 seconds
                                           Interval compaction: 0.03 GB write, 0.05 MB/s write, 0.03 GB read, 0.05 MB/s read, 0.3 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55bbd56b51f0#2 capacity: 308.00 MB usage: 5.32 MB table_size: 0 occupancy: 18446744073709551615 collections: 4 last_copies: 0 last_secs: 0.000123 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(537,5.01 MB,1.62791%) FilterBlock(22,110.55 KB,0.0350506%) IndexBlock(22,198.03 KB,0.0627889%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Dec 01 09:42:23 compute-0 ceph-mon[75031]: pgmap v955: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:23 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v956: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:25 compute-0 ceph-mon[75031]: pgmap v956: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:42:25 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v957: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:27 compute-0 nova_compute[250706]: 2025-12-01 09:42:27.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:42:27 compute-0 nova_compute[250706]: 2025-12-01 09:42:27.053 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 01 09:42:27 compute-0 ceph-mon[75031]: pgmap v957: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:27 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v958: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:29 compute-0 ceph-mon[75031]: pgmap v958: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:29 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v959: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:29 compute-0 podman[268359]: 2025-12-01 09:42:29.956088667 +0000 UTC m=+0.065073361 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 01 09:42:30 compute-0 ceph-mon[75031]: pgmap v959: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:30 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:42:31 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v960: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:33 compute-0 ceph-mon[75031]: pgmap v960: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:33 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v961: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:34 compute-0 nova_compute[250706]: 2025-12-01 09:42:34.062 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:42:34 compute-0 nova_compute[250706]: 2025-12-01 09:42:34.062 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:42:35 compute-0 nova_compute[250706]: 2025-12-01 09:42:35.051 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:42:35 compute-0 nova_compute[250706]: 2025-12-01 09:42:35.052 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 09:42:35 compute-0 nova_compute[250706]: 2025-12-01 09:42:35.052 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 09:42:35 compute-0 nova_compute[250706]: 2025-12-01 09:42:35.075 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 09:42:35 compute-0 nova_compute[250706]: 2025-12-01 09:42:35.076 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:42:35 compute-0 ceph-mon[75031]: pgmap v961: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:35 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:42:35 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v962: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:36 compute-0 nova_compute[250706]: 2025-12-01 09:42:36.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:42:36 compute-0 nova_compute[250706]: 2025-12-01 09:42:36.053 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:42:36 compute-0 nova_compute[250706]: 2025-12-01 09:42:36.053 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 01 09:42:36 compute-0 nova_compute[250706]: 2025-12-01 09:42:36.075 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 01 09:42:36 compute-0 nova_compute[250706]: 2025-12-01 09:42:36.076 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:42:37 compute-0 nova_compute[250706]: 2025-12-01 09:42:37.087 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:42:37 compute-0 nova_compute[250706]: 2025-12-01 09:42:37.113 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:42:37 compute-0 nova_compute[250706]: 2025-12-01 09:42:37.114 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:42:37 compute-0 ceph-mon[75031]: pgmap v962: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:37 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v963: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:38 compute-0 nova_compute[250706]: 2025-12-01 09:42:38.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:42:38 compute-0 nova_compute[250706]: 2025-12-01 09:42:38.053 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 09:42:39 compute-0 ceph-mon[75031]: pgmap v963: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:39 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v964: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:40 compute-0 podman[268379]: 2025-12-01 09:42:40.019857753 +0000 UTC m=+0.112979148 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 01 09:42:40 compute-0 nova_compute[250706]: 2025-12-01 09:42:40.051 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:42:40 compute-0 nova_compute[250706]: 2025-12-01 09:42:40.114 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:42:40 compute-0 nova_compute[250706]: 2025-12-01 09:42:40.115 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:42:40 compute-0 nova_compute[250706]: 2025-12-01 09:42:40.115 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:42:40 compute-0 nova_compute[250706]: 2025-12-01 09:42:40.116 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 09:42:40 compute-0 nova_compute[250706]: 2025-12-01 09:42:40.117 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:42:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:42:40 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/705532966' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:42:40 compute-0 nova_compute[250706]: 2025-12-01 09:42:40.575 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:42:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:42:40 compute-0 nova_compute[250706]: 2025-12-01 09:42:40.715 250710 WARNING nova.virt.libvirt.driver [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 09:42:40 compute-0 nova_compute[250706]: 2025-12-01 09:42:40.716 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5184MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 09:42:40 compute-0 nova_compute[250706]: 2025-12-01 09:42:40.717 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:42:40 compute-0 nova_compute[250706]: 2025-12-01 09:42:40.717 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:42:41 compute-0 nova_compute[250706]: 2025-12-01 09:42:41.130 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 09:42:41 compute-0 nova_compute[250706]: 2025-12-01 09:42:41.131 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 09:42:41 compute-0 nova_compute[250706]: 2025-12-01 09:42:41.273 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Refreshing inventories for resource provider 847e3dbe-0f76-4032-a374-8c965945c22f _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 01 09:42:41 compute-0 ceph-mon[75031]: pgmap v964: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:41 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/705532966' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:42:41 compute-0 nova_compute[250706]: 2025-12-01 09:42:41.399 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Updating ProviderTree inventory for provider 847e3dbe-0f76-4032-a374-8c965945c22f from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 01 09:42:41 compute-0 nova_compute[250706]: 2025-12-01 09:42:41.400 250710 DEBUG nova.compute.provider_tree [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Updating inventory in ProviderTree for provider 847e3dbe-0f76-4032-a374-8c965945c22f with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 01 09:42:41 compute-0 nova_compute[250706]: 2025-12-01 09:42:41.423 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Refreshing aggregate associations for resource provider 847e3dbe-0f76-4032-a374-8c965945c22f, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 01 09:42:41 compute-0 nova_compute[250706]: 2025-12-01 09:42:41.455 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Refreshing trait associations for resource provider 847e3dbe-0f76-4032-a374-8c965945c22f, traits: COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AVX2,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE4A,HW_CPU_X86_AESNI,HW_CPU_X86_SVM,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_F16C,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_FMA3,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_MMX,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_BMI2,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSSE3,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 01 09:42:41 compute-0 nova_compute[250706]: 2025-12-01 09:42:41.471 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:42:41 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v965: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:41 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:42:41 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4207099861' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:42:41 compute-0 nova_compute[250706]: 2025-12-01 09:42:41.946 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:42:41 compute-0 nova_compute[250706]: 2025-12-01 09:42:41.955 250710 DEBUG nova.compute.provider_tree [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed in ProviderTree for provider: 847e3dbe-0f76-4032-a374-8c965945c22f update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 09:42:41 compute-0 nova_compute[250706]: 2025-12-01 09:42:41.977 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed for provider 847e3dbe-0f76-4032-a374-8c965945c22f based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 09:42:41 compute-0 nova_compute[250706]: 2025-12-01 09:42:41.981 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 09:42:41 compute-0 nova_compute[250706]: 2025-12-01 09:42:41.982 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.264s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:42:42 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/4207099861' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:42:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:42:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:42:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:42:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:42:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:42:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:42:43 compute-0 ceph-mon[75031]: pgmap v965: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:43 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v966: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:43 compute-0 podman[268448]: 2025-12-01 09:42:43.969699251 +0000 UTC m=+0.068675564 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 01 09:42:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Dec 01 09:42:44 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2704639101' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:42:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Dec 01 09:42:44 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2704639101' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:42:45 compute-0 ceph-mon[75031]: pgmap v966: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:45 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/2704639101' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:42:45 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/2704639101' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:42:45 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:42:45 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v967: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:47 compute-0 ceph-mon[75031]: pgmap v967: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:47 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v968: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:49 compute-0 ceph-mon[75031]: pgmap v968: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:49 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v969: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:50 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:42:51 compute-0 ceph-mon[75031]: pgmap v969: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:51 compute-0 nova_compute[250706]: 2025-12-01 09:42:51.800 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:42:51 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v970: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:52 compute-0 sudo[268467]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:42:52 compute-0 sudo[268467]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:42:52 compute-0 sudo[268467]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:52 compute-0 sudo[268492]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:42:52 compute-0 sudo[268492]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:42:52 compute-0 sudo[268492]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:52 compute-0 sudo[268517]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:42:52 compute-0 sudo[268517]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:42:52 compute-0 sudo[268517]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:52 compute-0 sudo[268542]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Dec 01 09:42:52 compute-0 sudo[268542]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:42:53 compute-0 ceph-mon[75031]: pgmap v970: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:53 compute-0 sudo[268542]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:53 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:42:53 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:42:53 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec 01 09:42:53 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:42:53 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec 01 09:42:53 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:42:53 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 16365d02-e8a0-4429-a516-de704f982ae9 does not exist
Dec 01 09:42:53 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 60d904bf-51e1-43e7-b466-3b4a62d111be does not exist
Dec 01 09:42:53 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 3f694ba4-86fe-4916-86f6-9755cf590135 does not exist
Dec 01 09:42:53 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec 01 09:42:53 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:42:53 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec 01 09:42:53 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:42:53 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:42:53 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:42:53 compute-0 sudo[268599]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:42:53 compute-0 sudo[268599]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:42:53 compute-0 sudo[268599]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:53 compute-0 sudo[268624]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:42:53 compute-0 sudo[268624]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:42:53 compute-0 sudo[268624]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:53 compute-0 sudo[268649]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:42:53 compute-0 sudo[268649]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:42:53 compute-0 sudo[268649]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:53 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v971: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:53 compute-0 sudo[268674]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --yes --no-systemd
Dec 01 09:42:53 compute-0 sudo[268674]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:42:54 compute-0 podman[268737]: 2025-12-01 09:42:54.230006215 +0000 UTC m=+0.054473476 container create f701234a8b34dee0e533ff523dffbbc0157b9daa8877f88f04c4d86762324251 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=tender_hodgkin, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_REF=reef)
Dec 01 09:42:54 compute-0 systemd[1]: Started libpod-conmon-f701234a8b34dee0e533ff523dffbbc0157b9daa8877f88f04c4d86762324251.scope.
Dec 01 09:42:54 compute-0 podman[268737]: 2025-12-01 09:42:54.20686845 +0000 UTC m=+0.031335681 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:42:54 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:42:54 compute-0 podman[268737]: 2025-12-01 09:42:54.323934955 +0000 UTC m=+0.148402216 container init f701234a8b34dee0e533ff523dffbbc0157b9daa8877f88f04c4d86762324251 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=tender_hodgkin, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:42:54 compute-0 podman[268737]: 2025-12-01 09:42:54.335896488 +0000 UTC m=+0.160363749 container start f701234a8b34dee0e533ff523dffbbc0157b9daa8877f88f04c4d86762324251 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=tender_hodgkin, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:42:54 compute-0 podman[268737]: 2025-12-01 09:42:54.340025827 +0000 UTC m=+0.164493098 container attach f701234a8b34dee0e533ff523dffbbc0157b9daa8877f88f04c4d86762324251 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=tender_hodgkin, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:42:54 compute-0 tender_hodgkin[268754]: 167 167
Dec 01 09:42:54 compute-0 systemd[1]: libpod-f701234a8b34dee0e533ff523dffbbc0157b9daa8877f88f04c4d86762324251.scope: Deactivated successfully.
Dec 01 09:42:54 compute-0 podman[268737]: 2025-12-01 09:42:54.347595994 +0000 UTC m=+0.172063225 container died f701234a8b34dee0e533ff523dffbbc0157b9daa8877f88f04c4d86762324251 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=tender_hodgkin, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:42:54 compute-0 systemd[1]: var-lib-containers-storage-overlay-1be581f34e6483271c4055ae8cb7dfe7e0d6e388c3514495ef760c9826d200c6-merged.mount: Deactivated successfully.
Dec 01 09:42:54 compute-0 podman[268737]: 2025-12-01 09:42:54.389804208 +0000 UTC m=+0.214271429 container remove f701234a8b34dee0e533ff523dffbbc0157b9daa8877f88f04c4d86762324251 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=tender_hodgkin, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Dec 01 09:42:54 compute-0 systemd[1]: libpod-conmon-f701234a8b34dee0e533ff523dffbbc0157b9daa8877f88f04c4d86762324251.scope: Deactivated successfully.
Dec 01 09:42:54 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:42:54 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:42:54 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:42:54 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:42:54 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:42:54 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:42:54 compute-0 podman[268777]: 2025-12-01 09:42:54.649269244 +0000 UTC m=+0.078520948 container create d85e088b50f0286358774532b9580d23915f4311d303e30666a94f3af36f6fef (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_mendel, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:42:54 compute-0 systemd[1]: Started libpod-conmon-d85e088b50f0286358774532b9580d23915f4311d303e30666a94f3af36f6fef.scope.
Dec 01 09:42:54 compute-0 podman[268777]: 2025-12-01 09:42:54.61783287 +0000 UTC m=+0.047084624 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:42:54 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:42:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/51f232011849cefdf52e80b7b34ba5ca37bc7b3962b42e125ac8416594039e28/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:42:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/51f232011849cefdf52e80b7b34ba5ca37bc7b3962b42e125ac8416594039e28/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:42:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/51f232011849cefdf52e80b7b34ba5ca37bc7b3962b42e125ac8416594039e28/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:42:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/51f232011849cefdf52e80b7b34ba5ca37bc7b3962b42e125ac8416594039e28/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:42:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/51f232011849cefdf52e80b7b34ba5ca37bc7b3962b42e125ac8416594039e28/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:42:54 compute-0 podman[268777]: 2025-12-01 09:42:54.76187347 +0000 UTC m=+0.191125224 container init d85e088b50f0286358774532b9580d23915f4311d303e30666a94f3af36f6fef (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_mendel, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:42:54 compute-0 podman[268777]: 2025-12-01 09:42:54.777687004 +0000 UTC m=+0.206938708 container start d85e088b50f0286358774532b9580d23915f4311d303e30666a94f3af36f6fef (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_mendel, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:42:54 compute-0 podman[268777]: 2025-12-01 09:42:54.782207844 +0000 UTC m=+0.211459558 container attach d85e088b50f0286358774532b9580d23915f4311d303e30666a94f3af36f6fef (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_mendel, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:42:55 compute-0 ceph-mon[75031]: pgmap v971: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:55 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:42:55 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v972: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:56 compute-0 relaxed_mendel[268793]: --> passed data devices: 0 physical, 3 LVM
Dec 01 09:42:56 compute-0 relaxed_mendel[268793]: --> relative data size: 1.0
Dec 01 09:42:56 compute-0 relaxed_mendel[268793]: --> All data devices are unavailable
Dec 01 09:42:56 compute-0 systemd[1]: libpod-d85e088b50f0286358774532b9580d23915f4311d303e30666a94f3af36f6fef.scope: Deactivated successfully.
Dec 01 09:42:56 compute-0 systemd[1]: libpod-d85e088b50f0286358774532b9580d23915f4311d303e30666a94f3af36f6fef.scope: Consumed 1.198s CPU time.
Dec 01 09:42:56 compute-0 podman[268777]: 2025-12-01 09:42:56.038364522 +0000 UTC m=+1.467616226 container died d85e088b50f0286358774532b9580d23915f4311d303e30666a94f3af36f6fef (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_mendel, OSD_FLAVOR=default, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:42:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-51f232011849cefdf52e80b7b34ba5ca37bc7b3962b42e125ac8416594039e28-merged.mount: Deactivated successfully.
Dec 01 09:42:56 compute-0 podman[268777]: 2025-12-01 09:42:56.111843994 +0000 UTC m=+1.541095668 container remove d85e088b50f0286358774532b9580d23915f4311d303e30666a94f3af36f6fef (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_mendel, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3)
Dec 01 09:42:56 compute-0 systemd[1]: libpod-conmon-d85e088b50f0286358774532b9580d23915f4311d303e30666a94f3af36f6fef.scope: Deactivated successfully.
Dec 01 09:42:56 compute-0 sudo[268674]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:56 compute-0 sudo[268833]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:42:56 compute-0 sudo[268833]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:42:56 compute-0 sudo[268833]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:56 compute-0 sudo[268858]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:42:56 compute-0 sudo[268858]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:42:56 compute-0 sudo[268858]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:56 compute-0 sudo[268883]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:42:56 compute-0 sudo[268883]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:42:56 compute-0 sudo[268883]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:56 compute-0 sudo[268908]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- lvm list --format json
Dec 01 09:42:56 compute-0 sudo[268908]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:42:56 compute-0 podman[268975]: 2025-12-01 09:42:56.753762191 +0000 UTC m=+0.055333622 container create 552d934ac71a052b974c4a78ee5cf15414cb92a8047c2562d1d1bad887b96628 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_sanderson, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:42:56 compute-0 systemd[1]: Started libpod-conmon-552d934ac71a052b974c4a78ee5cf15414cb92a8047c2562d1d1bad887b96628.scope.
Dec 01 09:42:56 compute-0 podman[268975]: 2025-12-01 09:42:56.727163426 +0000 UTC m=+0.028734897 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:42:56 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:42:56 compute-0 podman[268975]: 2025-12-01 09:42:56.877285261 +0000 UTC m=+0.178856702 container init 552d934ac71a052b974c4a78ee5cf15414cb92a8047c2562d1d1bad887b96628 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_sanderson, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:42:56 compute-0 podman[268975]: 2025-12-01 09:42:56.884202089 +0000 UTC m=+0.185773480 container start 552d934ac71a052b974c4a78ee5cf15414cb92a8047c2562d1d1bad887b96628 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_sanderson, io.buildah.version=1.39.3, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:42:56 compute-0 podman[268975]: 2025-12-01 09:42:56.887586126 +0000 UTC m=+0.189157647 container attach 552d934ac71a052b974c4a78ee5cf15414cb92a8047c2562d1d1bad887b96628 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_sanderson, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Dec 01 09:42:56 compute-0 dreamy_sanderson[268991]: 167 167
Dec 01 09:42:56 compute-0 systemd[1]: libpod-552d934ac71a052b974c4a78ee5cf15414cb92a8047c2562d1d1bad887b96628.scope: Deactivated successfully.
Dec 01 09:42:56 compute-0 podman[268975]: 2025-12-01 09:42:56.890371366 +0000 UTC m=+0.191942787 container died 552d934ac71a052b974c4a78ee5cf15414cb92a8047c2562d1d1bad887b96628 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_sanderson, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=reef)
Dec 01 09:42:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-413cf90f2dc60ea86e55d6bfb190fa9b4b016ff7500c44517e902b76be08b7f2-merged.mount: Deactivated successfully.
Dec 01 09:42:56 compute-0 podman[268975]: 2025-12-01 09:42:56.931443297 +0000 UTC m=+0.233014688 container remove 552d934ac71a052b974c4a78ee5cf15414cb92a8047c2562d1d1bad887b96628 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_sanderson, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:42:56 compute-0 systemd[1]: libpod-conmon-552d934ac71a052b974c4a78ee5cf15414cb92a8047c2562d1d1bad887b96628.scope: Deactivated successfully.
Dec 01 09:42:57 compute-0 podman[269013]: 2025-12-01 09:42:57.127685886 +0000 UTC m=+0.052647244 container create 78a181a56c4c61cc4939a668973a8a837feb6b7f20f91a420314a4a2b7ad19c7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_jepsen, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:42:57 compute-0 systemd[1]: Started libpod-conmon-78a181a56c4c61cc4939a668973a8a837feb6b7f20f91a420314a4a2b7ad19c7.scope.
Dec 01 09:42:57 compute-0 podman[269013]: 2025-12-01 09:42:57.099366293 +0000 UTC m=+0.024327731 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:42:57 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:42:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dcc1ca4fe3b70283bb634e8bcc1b187afa174a20056a322112a669d21a8e5970/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:42:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dcc1ca4fe3b70283bb634e8bcc1b187afa174a20056a322112a669d21a8e5970/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:42:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dcc1ca4fe3b70283bb634e8bcc1b187afa174a20056a322112a669d21a8e5970/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:42:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dcc1ca4fe3b70283bb634e8bcc1b187afa174a20056a322112a669d21a8e5970/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:42:57 compute-0 podman[269013]: 2025-12-01 09:42:57.217709923 +0000 UTC m=+0.142671371 container init 78a181a56c4c61cc4939a668973a8a837feb6b7f20f91a420314a4a2b7ad19c7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_jepsen, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=reef)
Dec 01 09:42:57 compute-0 podman[269013]: 2025-12-01 09:42:57.23219179 +0000 UTC m=+0.157153178 container start 78a181a56c4c61cc4939a668973a8a837feb6b7f20f91a420314a4a2b7ad19c7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_jepsen, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:42:57 compute-0 podman[269013]: 2025-12-01 09:42:57.236166164 +0000 UTC m=+0.161127602 container attach 78a181a56c4c61cc4939a668973a8a837feb6b7f20f91a420314a4a2b7ad19c7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_jepsen, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:42:57 compute-0 ceph-mon[75031]: pgmap v972: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:57 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v973: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]: {
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:     "0": [
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:         {
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             "devices": [
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "/dev/loop3"
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             ],
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             "lv_name": "ceph_lv0",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             "lv_size": "21470642176",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             "name": "ceph_lv0",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             "tags": {
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.cluster_name": "ceph",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.crush_device_class": "",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.encrypted": "0",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.osd_id": "0",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.type": "block",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.vdo": "0"
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             },
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             "type": "block",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             "vg_name": "ceph_vg0"
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:         }
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:     ],
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:     "1": [
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:         {
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             "devices": [
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "/dev/loop4"
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             ],
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             "lv_name": "ceph_lv1",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             "lv_size": "21470642176",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             "name": "ceph_lv1",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             "tags": {
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.cluster_name": "ceph",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.crush_device_class": "",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.encrypted": "0",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.osd_id": "1",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.type": "block",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.vdo": "0"
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             },
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             "type": "block",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             "vg_name": "ceph_vg1"
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:         }
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:     ],
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:     "2": [
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:         {
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             "devices": [
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "/dev/loop5"
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             ],
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             "lv_name": "ceph_lv2",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             "lv_size": "21470642176",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             "name": "ceph_lv2",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             "tags": {
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.cluster_name": "ceph",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.crush_device_class": "",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.encrypted": "0",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.osd_id": "2",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.type": "block",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:                 "ceph.vdo": "0"
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             },
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             "type": "block",
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:             "vg_name": "ceph_vg2"
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:         }
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]:     ]
Dec 01 09:42:57 compute-0 zealous_jepsen[269029]: }
Dec 01 09:42:57 compute-0 systemd[1]: libpod-78a181a56c4c61cc4939a668973a8a837feb6b7f20f91a420314a4a2b7ad19c7.scope: Deactivated successfully.
Dec 01 09:42:57 compute-0 podman[269013]: 2025-12-01 09:42:57.99463741 +0000 UTC m=+0.919598798 container died 78a181a56c4c61cc4939a668973a8a837feb6b7f20f91a420314a4a2b7ad19c7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_jepsen, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3)
Dec 01 09:42:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-dcc1ca4fe3b70283bb634e8bcc1b187afa174a20056a322112a669d21a8e5970-merged.mount: Deactivated successfully.
Dec 01 09:42:58 compute-0 podman[269013]: 2025-12-01 09:42:58.062400118 +0000 UTC m=+0.987361476 container remove 78a181a56c4c61cc4939a668973a8a837feb6b7f20f91a420314a4a2b7ad19c7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_jepsen, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:42:58 compute-0 systemd[1]: libpod-conmon-78a181a56c4c61cc4939a668973a8a837feb6b7f20f91a420314a4a2b7ad19c7.scope: Deactivated successfully.
Dec 01 09:42:58 compute-0 sudo[268908]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:58 compute-0 sudo[269052]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:42:58 compute-0 sudo[269052]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:42:58 compute-0 sudo[269052]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:58 compute-0 sudo[269077]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:42:58 compute-0 sudo[269077]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:42:58 compute-0 sudo[269077]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:58 compute-0 sudo[269102]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:42:58 compute-0 sudo[269102]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:42:58 compute-0 sudo[269102]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:58 compute-0 sudo[269127]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- raw list --format json
Dec 01 09:42:58 compute-0 sudo[269127]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:42:58 compute-0 podman[269192]: 2025-12-01 09:42:58.776311324 +0000 UTC m=+0.040159845 container create c9ed935af98772ecdb1a8f40f6b6e7e28adf19ecfe08c9c12a06f2b85c534dba (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_meitner, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True)
Dec 01 09:42:58 compute-0 systemd[1]: Started libpod-conmon-c9ed935af98772ecdb1a8f40f6b6e7e28adf19ecfe08c9c12a06f2b85c534dba.scope.
Dec 01 09:42:58 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:42:58 compute-0 podman[269192]: 2025-12-01 09:42:58.849675472 +0000 UTC m=+0.113524043 container init c9ed935af98772ecdb1a8f40f6b6e7e28adf19ecfe08c9c12a06f2b85c534dba (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_meitner, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:42:58 compute-0 podman[269192]: 2025-12-01 09:42:58.758167312 +0000 UTC m=+0.022015843 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:42:58 compute-0 podman[269192]: 2025-12-01 09:42:58.856861609 +0000 UTC m=+0.120710130 container start c9ed935af98772ecdb1a8f40f6b6e7e28adf19ecfe08c9c12a06f2b85c534dba (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_meitner, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507)
Dec 01 09:42:58 compute-0 podman[269192]: 2025-12-01 09:42:58.860444492 +0000 UTC m=+0.124293023 container attach c9ed935af98772ecdb1a8f40f6b6e7e28adf19ecfe08c9c12a06f2b85c534dba (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_meitner, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507)
Dec 01 09:42:58 compute-0 elastic_meitner[269208]: 167 167
Dec 01 09:42:58 compute-0 systemd[1]: libpod-c9ed935af98772ecdb1a8f40f6b6e7e28adf19ecfe08c9c12a06f2b85c534dba.scope: Deactivated successfully.
Dec 01 09:42:58 compute-0 podman[269192]: 2025-12-01 09:42:58.862841001 +0000 UTC m=+0.126689522 container died c9ed935af98772ecdb1a8f40f6b6e7e28adf19ecfe08c9c12a06f2b85c534dba (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_meitner, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Dec 01 09:42:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-ecf0aba702be952da24c0ec3a660d4f2130d42ee7ee6d01fcca912ab745b93bb-merged.mount: Deactivated successfully.
Dec 01 09:42:58 compute-0 podman[269192]: 2025-12-01 09:42:58.902792728 +0000 UTC m=+0.166641239 container remove c9ed935af98772ecdb1a8f40f6b6e7e28adf19ecfe08c9c12a06f2b85c534dba (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_meitner, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=reef)
Dec 01 09:42:58 compute-0 systemd[1]: libpod-conmon-c9ed935af98772ecdb1a8f40f6b6e7e28adf19ecfe08c9c12a06f2b85c534dba.scope: Deactivated successfully.
Dec 01 09:42:59 compute-0 podman[269232]: 2025-12-01 09:42:59.051309016 +0000 UTC m=+0.042804481 container create efab5852a0e221cd8ca1a19bc74f2c2f994b19811d420f81eaeb39e9c02f5800 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_archimedes, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True)
Dec 01 09:42:59 compute-0 systemd[1]: Started libpod-conmon-efab5852a0e221cd8ca1a19bc74f2c2f994b19811d420f81eaeb39e9c02f5800.scope.
Dec 01 09:42:59 compute-0 podman[269232]: 2025-12-01 09:42:59.031068154 +0000 UTC m=+0.022563639 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:42:59 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:42:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/95059fb3ecf5f1f5e75a2c680ed28b6b046dd0a90dd1c1410f9a277dcbad717e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:42:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/95059fb3ecf5f1f5e75a2c680ed28b6b046dd0a90dd1c1410f9a277dcbad717e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:42:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/95059fb3ecf5f1f5e75a2c680ed28b6b046dd0a90dd1c1410f9a277dcbad717e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:42:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/95059fb3ecf5f1f5e75a2c680ed28b6b046dd0a90dd1c1410f9a277dcbad717e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:42:59 compute-0 podman[269232]: 2025-12-01 09:42:59.150865947 +0000 UTC m=+0.142361462 container init efab5852a0e221cd8ca1a19bc74f2c2f994b19811d420f81eaeb39e9c02f5800 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_archimedes, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:42:59 compute-0 podman[269232]: 2025-12-01 09:42:59.165858248 +0000 UTC m=+0.157353703 container start efab5852a0e221cd8ca1a19bc74f2c2f994b19811d420f81eaeb39e9c02f5800 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_archimedes, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2)
Dec 01 09:42:59 compute-0 podman[269232]: 2025-12-01 09:42:59.172111227 +0000 UTC m=+0.163606782 container attach efab5852a0e221cd8ca1a19bc74f2c2f994b19811d420f81eaeb39e9c02f5800 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_archimedes, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.build-date=20250507)
Dec 01 09:42:59 compute-0 ceph-mon[75031]: pgmap v973: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:42:59 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v974: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:00 compute-0 condescending_archimedes[269249]: {
Dec 01 09:43:00 compute-0 condescending_archimedes[269249]:     "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec 01 09:43:00 compute-0 condescending_archimedes[269249]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:43:00 compute-0 condescending_archimedes[269249]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 01 09:43:00 compute-0 condescending_archimedes[269249]:         "osd_id": 0,
Dec 01 09:43:00 compute-0 condescending_archimedes[269249]:         "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:43:00 compute-0 condescending_archimedes[269249]:         "type": "bluestore"
Dec 01 09:43:00 compute-0 condescending_archimedes[269249]:     },
Dec 01 09:43:00 compute-0 condescending_archimedes[269249]:     "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec 01 09:43:00 compute-0 condescending_archimedes[269249]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:43:00 compute-0 condescending_archimedes[269249]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 01 09:43:00 compute-0 condescending_archimedes[269249]:         "osd_id": 1,
Dec 01 09:43:00 compute-0 condescending_archimedes[269249]:         "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:43:00 compute-0 condescending_archimedes[269249]:         "type": "bluestore"
Dec 01 09:43:00 compute-0 condescending_archimedes[269249]:     },
Dec 01 09:43:00 compute-0 condescending_archimedes[269249]:     "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec 01 09:43:00 compute-0 condescending_archimedes[269249]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:43:00 compute-0 condescending_archimedes[269249]:         "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec 01 09:43:00 compute-0 condescending_archimedes[269249]:         "osd_id": 2,
Dec 01 09:43:00 compute-0 condescending_archimedes[269249]:         "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:43:00 compute-0 condescending_archimedes[269249]:         "type": "bluestore"
Dec 01 09:43:00 compute-0 condescending_archimedes[269249]:     }
Dec 01 09:43:00 compute-0 condescending_archimedes[269249]: }
Dec 01 09:43:00 compute-0 systemd[1]: libpod-efab5852a0e221cd8ca1a19bc74f2c2f994b19811d420f81eaeb39e9c02f5800.scope: Deactivated successfully.
Dec 01 09:43:00 compute-0 systemd[1]: libpod-efab5852a0e221cd8ca1a19bc74f2c2f994b19811d420f81eaeb39e9c02f5800.scope: Consumed 1.085s CPU time.
Dec 01 09:43:00 compute-0 podman[269232]: 2025-12-01 09:43:00.247064129 +0000 UTC m=+1.238559594 container died efab5852a0e221cd8ca1a19bc74f2c2f994b19811d420f81eaeb39e9c02f5800 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_archimedes, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:43:00 compute-0 systemd[1]: var-lib-containers-storage-overlay-95059fb3ecf5f1f5e75a2c680ed28b6b046dd0a90dd1c1410f9a277dcbad717e-merged.mount: Deactivated successfully.
Dec 01 09:43:00 compute-0 podman[269232]: 2025-12-01 09:43:00.316227266 +0000 UTC m=+1.307722721 container remove efab5852a0e221cd8ca1a19bc74f2c2f994b19811d420f81eaeb39e9c02f5800 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_archimedes, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True)
Dec 01 09:43:00 compute-0 systemd[1]: libpod-conmon-efab5852a0e221cd8ca1a19bc74f2c2f994b19811d420f81eaeb39e9c02f5800.scope: Deactivated successfully.
Dec 01 09:43:00 compute-0 sudo[269127]: pam_unix(sudo:session): session closed for user root
Dec 01 09:43:00 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:43:00 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:43:00 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:43:00 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:43:00 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev a491fa20-5905-4a53-bc00-838411234921 does not exist
Dec 01 09:43:00 compute-0 podman[269283]: 2025-12-01 09:43:00.368810918 +0000 UTC m=+0.085659833 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible)
Dec 01 09:43:00 compute-0 sudo[269312]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:43:00 compute-0 sudo[269312]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:43:00 compute-0 sudo[269312]: pam_unix(sudo:session): session closed for user root
Dec 01 09:43:00 compute-0 sudo[269337]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:43:00 compute-0 sudo[269337]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:43:00 compute-0 sudo[269337]: pam_unix(sudo:session): session closed for user root
Dec 01 09:43:00 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:43:01 compute-0 ceph-mon[75031]: pgmap v974: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:01 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:43:01 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:43:01 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v975: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:03 compute-0 ceph-mon[75031]: pgmap v975: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:03 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v976: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:05 compute-0 ceph-mon[75031]: pgmap v976: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:05 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:43:05 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v977: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:07 compute-0 ceph-mon[75031]: pgmap v977: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:07 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v978: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:08 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #48. Immutable memtables: 0.
Dec 01 09:43:08 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:43:08.438678) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 09:43:08 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 48
Dec 01 09:43:08 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582188438863, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 1460, "num_deletes": 251, "total_data_size": 1559509, "memory_usage": 1588464, "flush_reason": "Manual Compaction"}
Dec 01 09:43:08 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #49: started
Dec 01 09:43:08 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582188454759, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 49, "file_size": 1518358, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 19098, "largest_seqno": 20557, "table_properties": {"data_size": 1511565, "index_size": 3932, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 14119, "raw_average_key_size": 19, "raw_value_size": 1497868, "raw_average_value_size": 2103, "num_data_blocks": 181, "num_entries": 712, "num_filter_entries": 712, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582036, "oldest_key_time": 1764582036, "file_creation_time": 1764582188, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 49, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:43:08 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 16843 microseconds, and 9567 cpu microseconds.
Dec 01 09:43:08 compute-0 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 09:43:08 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:43:08.455584) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #49: 1518358 bytes OK
Dec 01 09:43:08 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:43:08.455632) [db/memtable_list.cc:519] [default] Level-0 commit table #49 started
Dec 01 09:43:08 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:43:08.457982) [db/memtable_list.cc:722] [default] Level-0 commit table #49: memtable #1 done
Dec 01 09:43:08 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:43:08.458017) EVENT_LOG_v1 {"time_micros": 1764582188458008, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 01 09:43:08 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:43:08.458043) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 01 09:43:08 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 1553070, prev total WAL file size 1553070, number of live WAL files 2.
Dec 01 09:43:08 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000045.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:43:08 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:43:08.459060) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031353036' seq:72057594037927935, type:22 .. '7061786F730031373538' seq:0, type:0; will stop at (end)
Dec 01 09:43:08 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 01 09:43:08 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [49(1482KB)], [47(5257KB)]
Dec 01 09:43:08 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582188459098, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [49], "files_L6": [47], "score": -1, "input_data_size": 6901997, "oldest_snapshot_seqno": -1}
Dec 01 09:43:08 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #50: 4008 keys, 5721325 bytes, temperature: kUnknown
Dec 01 09:43:08 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582188537535, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 50, "file_size": 5721325, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 5692621, "index_size": 17586, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10053, "raw_key_size": 96959, "raw_average_key_size": 24, "raw_value_size": 5618598, "raw_average_value_size": 1401, "num_data_blocks": 749, "num_entries": 4008, "num_filter_entries": 4008, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580340, "oldest_key_time": 0, "file_creation_time": 1764582188, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:43:08 compute-0 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 09:43:08 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:43:08.537899) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 5721325 bytes
Dec 01 09:43:08 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:43:08.562521) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 87.8 rd, 72.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.4, 5.1 +0.0 blob) out(5.5 +0.0 blob), read-write-amplify(8.3) write-amplify(3.8) OK, records in: 4522, records dropped: 514 output_compression: NoCompression
Dec 01 09:43:08 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:43:08.562553) EVENT_LOG_v1 {"time_micros": 1764582188562538, "job": 24, "event": "compaction_finished", "compaction_time_micros": 78568, "compaction_time_cpu_micros": 18051, "output_level": 6, "num_output_files": 1, "total_output_size": 5721325, "num_input_records": 4522, "num_output_records": 4008, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 09:43:08 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000049.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:43:08 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582188563198, "job": 24, "event": "table_file_deletion", "file_number": 49}
Dec 01 09:43:08 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:43:08 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582188565061, "job": 24, "event": "table_file_deletion", "file_number": 47}
Dec 01 09:43:08 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:43:08.458891) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:43:08 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:43:08.565219) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:43:08 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:43:08.565227) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:43:08 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:43:08.565229) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:43:08 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:43:08.565230) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:43:08 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:43:08.565232) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:43:09 compute-0 ceph-mon[75031]: pgmap v978: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:09 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v979: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:10 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:43:11 compute-0 podman[269362]: 2025-12-01 09:43:11.031417193 +0000 UTC m=+0.138092179 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec 01 09:43:11 compute-0 ceph-mon[75031]: pgmap v979: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:11 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v980: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:43:13
Dec 01 09:43:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:43:13 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:43:13 compute-0 ceph-mgr[75324]: [balancer INFO root] pools ['cephfs.cephfs.meta', 'vms', '.mgr', 'cephfs.cephfs.data', 'backups', 'images', 'volumes']
Dec 01 09:43:13 compute-0 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec 01 09:43:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:43:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:43:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:43:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:43:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:43:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:43:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:43:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:43:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:43:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:43:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:43:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:43:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:43:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:43:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:43:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:43:13 compute-0 ceph-mon[75031]: pgmap v980: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:13 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v981: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:13 compute-0 ceph-mgr[75324]: client.0 ms_handle_reset on v2:192.168.122.100:6800/3312476512
Dec 01 09:43:14 compute-0 podman[269390]: 2025-12-01 09:43:14.960257968 +0000 UTC m=+0.060007406 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Dec 01 09:43:15 compute-0 ceph-mon[75031]: pgmap v981: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:15 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:43:15 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v982: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:17 compute-0 ceph-mon[75031]: pgmap v982: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:17 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v983: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:43:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:43:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:43:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:43:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 6.359070782053786e-08 of space, bias 1.0, pg target 1.907721234616136e-05 quantized to 32 (current 32)
Dec 01 09:43:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:43:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 6.359070782053786e-08 of space, bias 1.0, pg target 1.907721234616136e-05 quantized to 32 (current 32)
Dec 01 09:43:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:43:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:43:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:43:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0006661126644201341 of space, bias 1.0, pg target 0.19983379932604023 quantized to 32 (current 32)
Dec 01 09:43:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:43:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec 01 09:43:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:43:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:43:19 compute-0 ceph-mon[75031]: pgmap v983: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:19 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v984: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:43:20.481 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:43:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:43:20.482 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:43:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:43:20.482 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:43:20 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:43:21 compute-0 ceph-mon[75031]: pgmap v984: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:21 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v985: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:21 compute-0 nova_compute[250706]: 2025-12-01 09:43:21.914 250710 DEBUG oslo_concurrency.processutils [None req-66a801ae-3fb8-4b9c-8151-dc5046eb74f0 c3c7352263884d5d840a03c60f06801b 2ddc1b1c1d524d75ba341a005b659048 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:43:21 compute-0 nova_compute[250706]: 2025-12-01 09:43:21.946 250710 DEBUG oslo_concurrency.processutils [None req-66a801ae-3fb8-4b9c-8151-dc5046eb74f0 c3c7352263884d5d840a03c60f06801b 2ddc1b1c1d524d75ba341a005b659048 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:43:23 compute-0 ceph-mon[75031]: pgmap v985: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:23 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v986: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:25 compute-0 ceph-mon[75031]: pgmap v986: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:43:25 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v987: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:27 compute-0 ceph-mon[75031]: pgmap v987: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:27 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v988: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:29 compute-0 ceph-mon[75031]: pgmap v988: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:29 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v989: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:30 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:43:30.242 159899 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:9e:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '66:a0:73:58:3b:fd'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 01 09:43:30 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:43:30.244 159899 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 01 09:43:30 compute-0 ceph-mon[75031]: pgmap v989: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:30 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:43:30 compute-0 podman[269410]: 2025-12-01 09:43:30.954948963 +0000 UTC m=+0.061206620 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 01 09:43:31 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v990: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:32 compute-0 ceph-mon[75031]: pgmap v990: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:33 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v991: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:34 compute-0 nova_compute[250706]: 2025-12-01 09:43:34.076 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:43:34 compute-0 ceph-mon[75031]: pgmap v991: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:35 compute-0 nova_compute[250706]: 2025-12-01 09:43:35.048 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:43:35 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:43:35 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v992: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:36 compute-0 nova_compute[250706]: 2025-12-01 09:43:36.051 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:43:36 compute-0 nova_compute[250706]: 2025-12-01 09:43:36.052 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 09:43:36 compute-0 nova_compute[250706]: 2025-12-01 09:43:36.053 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 09:43:36 compute-0 nova_compute[250706]: 2025-12-01 09:43:36.081 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 09:43:36 compute-0 ceph-mon[75031]: pgmap v992: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:37 compute-0 nova_compute[250706]: 2025-12-01 09:43:37.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:43:37 compute-0 nova_compute[250706]: 2025-12-01 09:43:37.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:43:37 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v993: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:38 compute-0 nova_compute[250706]: 2025-12-01 09:43:38.051 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:43:38 compute-0 nova_compute[250706]: 2025-12-01 09:43:38.051 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:43:38 compute-0 nova_compute[250706]: 2025-12-01 09:43:38.052 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 09:43:38 compute-0 ceph-mon[75031]: pgmap v993: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:39 compute-0 nova_compute[250706]: 2025-12-01 09:43:39.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:43:39 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v994: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:40 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:43:40.247 159899 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a8013a17-6378-4c2f-a5de-9d3b29c7a42e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 01 09:43:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:43:40 compute-0 ceph-mon[75031]: pgmap v994: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:41 compute-0 nova_compute[250706]: 2025-12-01 09:43:41.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:43:41 compute-0 nova_compute[250706]: 2025-12-01 09:43:41.089 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:43:41 compute-0 nova_compute[250706]: 2025-12-01 09:43:41.089 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:43:41 compute-0 nova_compute[250706]: 2025-12-01 09:43:41.089 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:43:41 compute-0 nova_compute[250706]: 2025-12-01 09:43:41.089 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 09:43:41 compute-0 nova_compute[250706]: 2025-12-01 09:43:41.090 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:43:41 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:43:41 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/675853247' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:43:41 compute-0 nova_compute[250706]: 2025-12-01 09:43:41.557 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:43:41 compute-0 nova_compute[250706]: 2025-12-01 09:43:41.786 250710 WARNING nova.virt.libvirt.driver [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 09:43:41 compute-0 nova_compute[250706]: 2025-12-01 09:43:41.788 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5166MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 09:43:41 compute-0 nova_compute[250706]: 2025-12-01 09:43:41.789 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:43:41 compute-0 nova_compute[250706]: 2025-12-01 09:43:41.789 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:43:41 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v995: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:41 compute-0 nova_compute[250706]: 2025-12-01 09:43:41.873 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 09:43:41 compute-0 nova_compute[250706]: 2025-12-01 09:43:41.873 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 09:43:41 compute-0 nova_compute[250706]: 2025-12-01 09:43:41.892 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:43:41 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/675853247' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:43:42 compute-0 podman[269452]: 2025-12-01 09:43:42.025702997 +0000 UTC m=+0.121214804 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 01 09:43:42 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:43:42 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3737642234' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:43:42 compute-0 nova_compute[250706]: 2025-12-01 09:43:42.380 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:43:42 compute-0 nova_compute[250706]: 2025-12-01 09:43:42.386 250710 DEBUG nova.compute.provider_tree [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed in ProviderTree for provider: 847e3dbe-0f76-4032-a374-8c965945c22f update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 09:43:42 compute-0 nova_compute[250706]: 2025-12-01 09:43:42.418 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed for provider 847e3dbe-0f76-4032-a374-8c965945c22f based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 09:43:42 compute-0 nova_compute[250706]: 2025-12-01 09:43:42.421 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 09:43:42 compute-0 nova_compute[250706]: 2025-12-01 09:43:42.421 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.632s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:43:42 compute-0 ceph-mon[75031]: pgmap v995: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:42 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3737642234' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:43:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:43:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:43:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:43:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:43:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:43:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:43:43 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v996: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Dec 01 09:43:44 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1788745724' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:43:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Dec 01 09:43:44 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1788745724' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:43:44 compute-0 ceph-mon[75031]: pgmap v996: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:44 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/1788745724' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:43:44 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/1788745724' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:43:45 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:43:45 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v997: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:45 compute-0 podman[269500]: 2025-12-01 09:43:45.990717841 +0000 UTC m=+0.094381482 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, 
container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 01 09:43:46 compute-0 ceph-mon[75031]: pgmap v997: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:47 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v998: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:48 compute-0 ceph-mon[75031]: pgmap v998: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:49 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v999: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:50 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:43:50 compute-0 ceph-mon[75031]: pgmap v999: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:51 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1000: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:53 compute-0 ceph-mon[75031]: pgmap v1000: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:53 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1001: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:55 compute-0 ceph-mon[75031]: pgmap v1001: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:55 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:43:55 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1002: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:57 compute-0 ceph-mon[75031]: pgmap v1002: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:57 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1003: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:59 compute-0 ceph-mon[75031]: pgmap v1003: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:43:59 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1004: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:00 compute-0 sudo[269519]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:44:00 compute-0 sudo[269519]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:44:00 compute-0 sudo[269519]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:00 compute-0 sudo[269544]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:44:00 compute-0 sudo[269544]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:44:00 compute-0 sudo[269544]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:00 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:44:00 compute-0 sudo[269569]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:44:00 compute-0 sudo[269569]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:44:00 compute-0 sudo[269569]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:00 compute-0 sudo[269594]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 check-host
Dec 01 09:44:00 compute-0 sudo[269594]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:44:01 compute-0 ceph-mon[75031]: pgmap v1004: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:01 compute-0 sudo[269594]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:01 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:44:01 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:44:01 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:44:01 compute-0 podman[269635]: 2025-12-01 09:44:01.078286059 +0000 UTC m=+0.077955281 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Dec 01 09:44:01 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:44:01 compute-0 sudo[269660]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:44:01 compute-0 sudo[269660]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:44:01 compute-0 sudo[269660]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:01 compute-0 sudo[269686]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:44:01 compute-0 sudo[269686]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:44:01 compute-0 sudo[269686]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:01 compute-0 sudo[269711]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:44:01 compute-0 sudo[269711]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:44:01 compute-0 sudo[269711]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:01 compute-0 sudo[269736]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Dec 01 09:44:01 compute-0 sudo[269736]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:44:01 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1005: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:01 compute-0 sudo[269736]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:01 compute-0 sudo[269793]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:44:01 compute-0 sudo[269793]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:44:01 compute-0 sudo[269793]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:02 compute-0 sudo[269818]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:44:02 compute-0 sudo[269818]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:44:02 compute-0 sudo[269818]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:02 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:44:02 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:44:02 compute-0 sudo[269843]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:44:02 compute-0 sudo[269843]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:44:02 compute-0 sudo[269843]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:02 compute-0 sudo[269868]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 list-networks
Dec 01 09:44:02 compute-0 sudo[269868]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:44:02 compute-0 sudo[269868]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:02 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:44:02 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:44:02 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:44:02 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:44:02 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:44:02 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:44:02 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec 01 09:44:02 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:44:02 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec 01 09:44:02 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:44:02 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 59a61d99-a55a-4455-a6ee-5bb7a704fbbf does not exist
Dec 01 09:44:02 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 43a4b7e1-f2ac-49ed-870b-f81bfeeaed0f does not exist
Dec 01 09:44:02 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 036bf8f0-e02e-43ed-8b5e-0aca57c775ce does not exist
Dec 01 09:44:02 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec 01 09:44:02 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:44:02 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec 01 09:44:02 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:44:02 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:44:02 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:44:02 compute-0 sudo[269911]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:44:02 compute-0 sudo[269911]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:44:02 compute-0 sudo[269911]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:02 compute-0 sudo[269936]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:44:02 compute-0 sudo[269936]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:44:02 compute-0 sudo[269936]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:02 compute-0 sudo[269961]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:44:02 compute-0 sudo[269961]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:44:02 compute-0 sudo[269961]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:02 compute-0 sudo[269986]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --yes --no-systemd
Dec 01 09:44:02 compute-0 sudo[269986]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:44:03 compute-0 podman[270052]: 2025-12-01 09:44:03.135396636 +0000 UTC m=+0.050252575 container create 02d662e3bdcfaf35a56d7323c01583ee7ec30a766b2d98066a22ee5fde5a7be6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_davinci, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3)
Dec 01 09:44:03 compute-0 systemd[1]: Started libpod-conmon-02d662e3bdcfaf35a56d7323c01583ee7ec30a766b2d98066a22ee5fde5a7be6.scope.
Dec 01 09:44:03 compute-0 podman[270052]: 2025-12-01 09:44:03.11361622 +0000 UTC m=+0.028472179 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:44:03 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:44:03 compute-0 podman[270052]: 2025-12-01 09:44:03.246732625 +0000 UTC m=+0.161588564 container init 02d662e3bdcfaf35a56d7323c01583ee7ec30a766b2d98066a22ee5fde5a7be6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_davinci, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:44:03 compute-0 podman[270052]: 2025-12-01 09:44:03.260605344 +0000 UTC m=+0.175461263 container start 02d662e3bdcfaf35a56d7323c01583ee7ec30a766b2d98066a22ee5fde5a7be6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_davinci, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:44:03 compute-0 podman[270052]: 2025-12-01 09:44:03.264432214 +0000 UTC m=+0.179288153 container attach 02d662e3bdcfaf35a56d7323c01583ee7ec30a766b2d98066a22ee5fde5a7be6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_davinci, ceph=True, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Dec 01 09:44:03 compute-0 infallible_davinci[270068]: 167 167
Dec 01 09:44:03 compute-0 systemd[1]: libpod-02d662e3bdcfaf35a56d7323c01583ee7ec30a766b2d98066a22ee5fde5a7be6.scope: Deactivated successfully.
Dec 01 09:44:03 compute-0 podman[270052]: 2025-12-01 09:44:03.271013953 +0000 UTC m=+0.185869912 container died 02d662e3bdcfaf35a56d7323c01583ee7ec30a766b2d98066a22ee5fde5a7be6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_davinci, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 01 09:44:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-d6b72bab5991849959c7a42491b9940babfb7457915684ee1ed888fa46854b1d-merged.mount: Deactivated successfully.
Dec 01 09:44:03 compute-0 podman[270052]: 2025-12-01 09:44:03.32137838 +0000 UTC m=+0.236234299 container remove 02d662e3bdcfaf35a56d7323c01583ee7ec30a766b2d98066a22ee5fde5a7be6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_davinci, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:44:03 compute-0 systemd[1]: libpod-conmon-02d662e3bdcfaf35a56d7323c01583ee7ec30a766b2d98066a22ee5fde5a7be6.scope: Deactivated successfully.
Dec 01 09:44:03 compute-0 ceph-mon[75031]: pgmap v1005: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:03 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:44:03 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:44:03 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:44:03 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:44:03 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:44:03 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:44:03 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:44:03 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:44:03 compute-0 podman[270095]: 2025-12-01 09:44:03.558191125 +0000 UTC m=+0.073214885 container create 5183f1fe83d536fa01cb5803920e40df59cd24f872a9fedb24e0da2028b2fdba (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wizardly_bhaskara, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Dec 01 09:44:03 compute-0 systemd[1]: Started libpod-conmon-5183f1fe83d536fa01cb5803920e40df59cd24f872a9fedb24e0da2028b2fdba.scope.
Dec 01 09:44:03 compute-0 podman[270095]: 2025-12-01 09:44:03.529048347 +0000 UTC m=+0.044072167 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:44:03 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:44:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a36585ed7733f19d9b3235140f5c47d936f021ea14b30648aa129e088d801c6/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:44:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a36585ed7733f19d9b3235140f5c47d936f021ea14b30648aa129e088d801c6/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:44:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a36585ed7733f19d9b3235140f5c47d936f021ea14b30648aa129e088d801c6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:44:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a36585ed7733f19d9b3235140f5c47d936f021ea14b30648aa129e088d801c6/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:44:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a36585ed7733f19d9b3235140f5c47d936f021ea14b30648aa129e088d801c6/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:44:03 compute-0 podman[270095]: 2025-12-01 09:44:03.662996457 +0000 UTC m=+0.178020207 container init 5183f1fe83d536fa01cb5803920e40df59cd24f872a9fedb24e0da2028b2fdba (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wizardly_bhaskara, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:44:03 compute-0 podman[270095]: 2025-12-01 09:44:03.676659269 +0000 UTC m=+0.191682999 container start 5183f1fe83d536fa01cb5803920e40df59cd24f872a9fedb24e0da2028b2fdba (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wizardly_bhaskara, CEPH_REF=reef, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Dec 01 09:44:03 compute-0 podman[270095]: 2025-12-01 09:44:03.684539856 +0000 UTC m=+0.199563616 container attach 5183f1fe83d536fa01cb5803920e40df59cd24f872a9fedb24e0da2028b2fdba (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wizardly_bhaskara, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 01 09:44:03 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1006: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:04 compute-0 wizardly_bhaskara[270111]: --> passed data devices: 0 physical, 3 LVM
Dec 01 09:44:04 compute-0 wizardly_bhaskara[270111]: --> relative data size: 1.0
Dec 01 09:44:04 compute-0 wizardly_bhaskara[270111]: --> All data devices are unavailable
Dec 01 09:44:04 compute-0 systemd[1]: libpod-5183f1fe83d536fa01cb5803920e40df59cd24f872a9fedb24e0da2028b2fdba.scope: Deactivated successfully.
Dec 01 09:44:04 compute-0 systemd[1]: libpod-5183f1fe83d536fa01cb5803920e40df59cd24f872a9fedb24e0da2028b2fdba.scope: Consumed 1.195s CPU time.
Dec 01 09:44:04 compute-0 conmon[270111]: conmon 5183f1fe83d536fa01cb <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-5183f1fe83d536fa01cb5803920e40df59cd24f872a9fedb24e0da2028b2fdba.scope/container/memory.events
Dec 01 09:44:04 compute-0 podman[270095]: 2025-12-01 09:44:04.905792672 +0000 UTC m=+1.420816462 container died 5183f1fe83d536fa01cb5803920e40df59cd24f872a9fedb24e0da2028b2fdba (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wizardly_bhaskara, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_REF=reef, ceph=True)
Dec 01 09:44:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-8a36585ed7733f19d9b3235140f5c47d936f021ea14b30648aa129e088d801c6-merged.mount: Deactivated successfully.
Dec 01 09:44:04 compute-0 podman[270095]: 2025-12-01 09:44:04.9714872 +0000 UTC m=+1.486510960 container remove 5183f1fe83d536fa01cb5803920e40df59cd24f872a9fedb24e0da2028b2fdba (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=wizardly_bhaskara, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:44:04 compute-0 systemd[1]: libpod-conmon-5183f1fe83d536fa01cb5803920e40df59cd24f872a9fedb24e0da2028b2fdba.scope: Deactivated successfully.
Dec 01 09:44:05 compute-0 sudo[269986]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:05 compute-0 sudo[270152]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:44:05 compute-0 sudo[270152]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:44:05 compute-0 sudo[270152]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:05 compute-0 sudo[270177]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:44:05 compute-0 sudo[270177]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:44:05 compute-0 sudo[270177]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:05 compute-0 sudo[270202]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:44:05 compute-0 sudo[270202]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:44:05 compute-0 sudo[270202]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:05 compute-0 sudo[270227]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- lvm list --format json
Dec 01 09:44:05 compute-0 sudo[270227]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:44:05 compute-0 ceph-mon[75031]: pgmap v1006: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:05 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:44:05 compute-0 podman[270291]: 2025-12-01 09:44:05.730856113 +0000 UTC m=+0.039110255 container create 7043e9b89b5bf221364f258ed4dc8650031dd876bb4793fb31be2897fd1e2e82 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=objective_lamarr, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Dec 01 09:44:05 compute-0 systemd[1]: Started libpod-conmon-7043e9b89b5bf221364f258ed4dc8650031dd876bb4793fb31be2897fd1e2e82.scope.
Dec 01 09:44:05 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:44:05 compute-0 podman[270291]: 2025-12-01 09:44:05.714551004 +0000 UTC m=+0.022805146 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:44:05 compute-0 podman[270291]: 2025-12-01 09:44:05.82192405 +0000 UTC m=+0.130178272 container init 7043e9b89b5bf221364f258ed4dc8650031dd876bb4793fb31be2897fd1e2e82 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=objective_lamarr, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Dec 01 09:44:05 compute-0 podman[270291]: 2025-12-01 09:44:05.831961848 +0000 UTC m=+0.140216030 container start 7043e9b89b5bf221364f258ed4dc8650031dd876bb4793fb31be2897fd1e2e82 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=objective_lamarr, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Dec 01 09:44:05 compute-0 podman[270291]: 2025-12-01 09:44:05.836563471 +0000 UTC m=+0.144817653 container attach 7043e9b89b5bf221364f258ed4dc8650031dd876bb4793fb31be2897fd1e2e82 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=objective_lamarr, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:44:05 compute-0 objective_lamarr[270308]: 167 167
Dec 01 09:44:05 compute-0 systemd[1]: libpod-7043e9b89b5bf221364f258ed4dc8650031dd876bb4793fb31be2897fd1e2e82.scope: Deactivated successfully.
Dec 01 09:44:05 compute-0 conmon[270308]: conmon 7043e9b89b5bf221364f <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-7043e9b89b5bf221364f258ed4dc8650031dd876bb4793fb31be2897fd1e2e82.scope/container/memory.events
Dec 01 09:44:05 compute-0 podman[270291]: 2025-12-01 09:44:05.841010558 +0000 UTC m=+0.149264740 container died 7043e9b89b5bf221364f258ed4dc8650031dd876bb4793fb31be2897fd1e2e82 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=objective_lamarr, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.build-date=20250507, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:44:05 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1007: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:05 compute-0 systemd[1]: var-lib-containers-storage-overlay-b34148366476cb4ee22895036ed5b0e703112f5982c3b07f0ffd5f2db05d2903-merged.mount: Deactivated successfully.
Dec 01 09:44:05 compute-0 podman[270291]: 2025-12-01 09:44:05.892418246 +0000 UTC m=+0.200672388 container remove 7043e9b89b5bf221364f258ed4dc8650031dd876bb4793fb31be2897fd1e2e82 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=objective_lamarr, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=reef)
Dec 01 09:44:05 compute-0 systemd[1]: libpod-conmon-7043e9b89b5bf221364f258ed4dc8650031dd876bb4793fb31be2897fd1e2e82.scope: Deactivated successfully.
Dec 01 09:44:06 compute-0 podman[270332]: 2025-12-01 09:44:06.065373326 +0000 UTC m=+0.060837589 container create 491d33e67080b3e9780da7a3e858480a41693db66ff53218550e271812a5cff9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_leakey, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250507)
Dec 01 09:44:06 compute-0 systemd[1]: Started libpod-conmon-491d33e67080b3e9780da7a3e858480a41693db66ff53218550e271812a5cff9.scope.
Dec 01 09:44:06 compute-0 podman[270332]: 2025-12-01 09:44:06.032166072 +0000 UTC m=+0.027630415 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:44:06 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:44:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b089f6b8df3ef98acabe274ed0aa903921773524fbfd3dd8e6dfc23a5a2c04e4/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:44:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b089f6b8df3ef98acabe274ed0aa903921773524fbfd3dd8e6dfc23a5a2c04e4/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:44:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b089f6b8df3ef98acabe274ed0aa903921773524fbfd3dd8e6dfc23a5a2c04e4/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:44:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b089f6b8df3ef98acabe274ed0aa903921773524fbfd3dd8e6dfc23a5a2c04e4/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:44:06 compute-0 podman[270332]: 2025-12-01 09:44:06.16468353 +0000 UTC m=+0.160147823 container init 491d33e67080b3e9780da7a3e858480a41693db66ff53218550e271812a5cff9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_leakey, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 01 09:44:06 compute-0 podman[270332]: 2025-12-01 09:44:06.172074522 +0000 UTC m=+0.167538805 container start 491d33e67080b3e9780da7a3e858480a41693db66ff53218550e271812a5cff9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_leakey, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_REF=reef, ceph=True)
Dec 01 09:44:06 compute-0 podman[270332]: 2025-12-01 09:44:06.17580711 +0000 UTC m=+0.171271383 container attach 491d33e67080b3e9780da7a3e858480a41693db66ff53218550e271812a5cff9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_leakey, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, OSD_FLAVOR=default)
Dec 01 09:44:06 compute-0 laughing_leakey[270349]: {
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:     "0": [
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:         {
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             "devices": [
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "/dev/loop3"
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             ],
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             "lv_name": "ceph_lv0",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             "lv_size": "21470642176",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             "name": "ceph_lv0",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             "tags": {
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.cluster_name": "ceph",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.crush_device_class": "",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.encrypted": "0",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.osd_id": "0",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.type": "block",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.vdo": "0"
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             },
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             "type": "block",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             "vg_name": "ceph_vg0"
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:         }
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:     ],
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:     "1": [
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:         {
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             "devices": [
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "/dev/loop4"
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             ],
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             "lv_name": "ceph_lv1",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             "lv_size": "21470642176",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             "name": "ceph_lv1",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             "tags": {
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.cluster_name": "ceph",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.crush_device_class": "",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.encrypted": "0",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.osd_id": "1",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.type": "block",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.vdo": "0"
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             },
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             "type": "block",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             "vg_name": "ceph_vg1"
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:         }
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:     ],
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:     "2": [
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:         {
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             "devices": [
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "/dev/loop5"
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             ],
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             "lv_name": "ceph_lv2",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             "lv_size": "21470642176",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             "name": "ceph_lv2",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             "tags": {
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.cluster_name": "ceph",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.crush_device_class": "",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.encrypted": "0",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.osd_id": "2",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.type": "block",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:                 "ceph.vdo": "0"
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             },
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             "type": "block",
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:             "vg_name": "ceph_vg2"
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:         }
Dec 01 09:44:06 compute-0 laughing_leakey[270349]:     ]
Dec 01 09:44:06 compute-0 laughing_leakey[270349]: }
Dec 01 09:44:06 compute-0 systemd[1]: libpod-491d33e67080b3e9780da7a3e858480a41693db66ff53218550e271812a5cff9.scope: Deactivated successfully.
Dec 01 09:44:06 compute-0 podman[270332]: 2025-12-01 09:44:06.993886548 +0000 UTC m=+0.989350851 container died 491d33e67080b3e9780da7a3e858480a41693db66ff53218550e271812a5cff9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_leakey, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:44:07 compute-0 systemd[1]: var-lib-containers-storage-overlay-b089f6b8df3ef98acabe274ed0aa903921773524fbfd3dd8e6dfc23a5a2c04e4-merged.mount: Deactivated successfully.
Dec 01 09:44:07 compute-0 podman[270332]: 2025-12-01 09:44:07.062664285 +0000 UTC m=+1.058128548 container remove 491d33e67080b3e9780da7a3e858480a41693db66ff53218550e271812a5cff9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_leakey, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:44:07 compute-0 systemd[1]: libpod-conmon-491d33e67080b3e9780da7a3e858480a41693db66ff53218550e271812a5cff9.scope: Deactivated successfully.
Dec 01 09:44:07 compute-0 sudo[270227]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:07 compute-0 sudo[270370]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:44:07 compute-0 sudo[270370]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:44:07 compute-0 sudo[270370]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:07 compute-0 sudo[270395]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:44:07 compute-0 sudo[270395]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:44:07 compute-0 sudo[270395]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:07 compute-0 sudo[270420]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:44:07 compute-0 sudo[270420]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:44:07 compute-0 sudo[270420]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:07 compute-0 sudo[270445]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- raw list --format json
Dec 01 09:44:07 compute-0 sudo[270445]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:44:07 compute-0 ceph-mon[75031]: pgmap v1007: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:07 compute-0 podman[270509]: 2025-12-01 09:44:07.828132033 +0000 UTC m=+0.050006588 container create 04d9c1cd03eaf2499aea646e72f633e98f421cdf4802348a4eb54569e4437098 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jovial_wu, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:44:07 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1008: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:07 compute-0 systemd[1]: Started libpod-conmon-04d9c1cd03eaf2499aea646e72f633e98f421cdf4802348a4eb54569e4437098.scope.
Dec 01 09:44:07 compute-0 podman[270509]: 2025-12-01 09:44:07.806723247 +0000 UTC m=+0.028597762 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:44:07 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:44:07 compute-0 podman[270509]: 2025-12-01 09:44:07.926663754 +0000 UTC m=+0.148538309 container init 04d9c1cd03eaf2499aea646e72f633e98f421cdf4802348a4eb54569e4437098 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jovial_wu, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 01 09:44:07 compute-0 podman[270509]: 2025-12-01 09:44:07.934681805 +0000 UTC m=+0.156556340 container start 04d9c1cd03eaf2499aea646e72f633e98f421cdf4802348a4eb54569e4437098 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jovial_wu, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:44:07 compute-0 podman[270509]: 2025-12-01 09:44:07.940401019 +0000 UTC m=+0.162275554 container attach 04d9c1cd03eaf2499aea646e72f633e98f421cdf4802348a4eb54569e4437098 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jovial_wu, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Dec 01 09:44:07 compute-0 jovial_wu[270525]: 167 167
Dec 01 09:44:07 compute-0 systemd[1]: libpod-04d9c1cd03eaf2499aea646e72f633e98f421cdf4802348a4eb54569e4437098.scope: Deactivated successfully.
Dec 01 09:44:07 compute-0 podman[270509]: 2025-12-01 09:44:07.944103085 +0000 UTC m=+0.165977660 container died 04d9c1cd03eaf2499aea646e72f633e98f421cdf4802348a4eb54569e4437098 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jovial_wu, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:44:07 compute-0 systemd[1]: var-lib-containers-storage-overlay-e4bea85d536013c5230ebd1672e1188693cd82e4037d86c1cc42b2a6f731878d-merged.mount: Deactivated successfully.
Dec 01 09:44:07 compute-0 podman[270509]: 2025-12-01 09:44:07.985260268 +0000 UTC m=+0.207134783 container remove 04d9c1cd03eaf2499aea646e72f633e98f421cdf4802348a4eb54569e4437098 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jovial_wu, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Dec 01 09:44:08 compute-0 systemd[1]: libpod-conmon-04d9c1cd03eaf2499aea646e72f633e98f421cdf4802348a4eb54569e4437098.scope: Deactivated successfully.
Dec 01 09:44:08 compute-0 podman[270550]: 2025-12-01 09:44:08.193606255 +0000 UTC m=+0.050041099 container create 3bd71b516023d86831a53ea6aeff432b7663afb40db94b742a82e702a8f45761 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_ptolemy, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:44:08 compute-0 systemd[1]: Started libpod-conmon-3bd71b516023d86831a53ea6aeff432b7663afb40db94b742a82e702a8f45761.scope.
Dec 01 09:44:08 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:44:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64c2e22a2e467daaf7950fec52f49703ecc26274259c002fad14e9110a8d5f84/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:44:08 compute-0 podman[270550]: 2025-12-01 09:44:08.169802391 +0000 UTC m=+0.026237235 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:44:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64c2e22a2e467daaf7950fec52f49703ecc26274259c002fad14e9110a8d5f84/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:44:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64c2e22a2e467daaf7950fec52f49703ecc26274259c002fad14e9110a8d5f84/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:44:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64c2e22a2e467daaf7950fec52f49703ecc26274259c002fad14e9110a8d5f84/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:44:08 compute-0 podman[270550]: 2025-12-01 09:44:08.277088265 +0000 UTC m=+0.133523149 container init 3bd71b516023d86831a53ea6aeff432b7663afb40db94b742a82e702a8f45761 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_ptolemy, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 01 09:44:08 compute-0 podman[270550]: 2025-12-01 09:44:08.287740911 +0000 UTC m=+0.144175755 container start 3bd71b516023d86831a53ea6aeff432b7663afb40db94b742a82e702a8f45761 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_ptolemy, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:44:08 compute-0 podman[270550]: 2025-12-01 09:44:08.292172098 +0000 UTC m=+0.148606952 container attach 3bd71b516023d86831a53ea6aeff432b7663afb40db94b742a82e702a8f45761 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_ptolemy, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:44:09 compute-0 lucid_ptolemy[270566]: {
Dec 01 09:44:09 compute-0 lucid_ptolemy[270566]:     "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec 01 09:44:09 compute-0 lucid_ptolemy[270566]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:44:09 compute-0 lucid_ptolemy[270566]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 01 09:44:09 compute-0 lucid_ptolemy[270566]:         "osd_id": 0,
Dec 01 09:44:09 compute-0 lucid_ptolemy[270566]:         "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:44:09 compute-0 lucid_ptolemy[270566]:         "type": "bluestore"
Dec 01 09:44:09 compute-0 lucid_ptolemy[270566]:     },
Dec 01 09:44:09 compute-0 lucid_ptolemy[270566]:     "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec 01 09:44:09 compute-0 lucid_ptolemy[270566]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:44:09 compute-0 lucid_ptolemy[270566]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 01 09:44:09 compute-0 lucid_ptolemy[270566]:         "osd_id": 1,
Dec 01 09:44:09 compute-0 lucid_ptolemy[270566]:         "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:44:09 compute-0 lucid_ptolemy[270566]:         "type": "bluestore"
Dec 01 09:44:09 compute-0 lucid_ptolemy[270566]:     },
Dec 01 09:44:09 compute-0 lucid_ptolemy[270566]:     "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec 01 09:44:09 compute-0 lucid_ptolemy[270566]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:44:09 compute-0 lucid_ptolemy[270566]:         "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec 01 09:44:09 compute-0 lucid_ptolemy[270566]:         "osd_id": 2,
Dec 01 09:44:09 compute-0 lucid_ptolemy[270566]:         "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:44:09 compute-0 lucid_ptolemy[270566]:         "type": "bluestore"
Dec 01 09:44:09 compute-0 lucid_ptolemy[270566]:     }
Dec 01 09:44:09 compute-0 lucid_ptolemy[270566]: }
Dec 01 09:44:09 compute-0 systemd[1]: libpod-3bd71b516023d86831a53ea6aeff432b7663afb40db94b742a82e702a8f45761.scope: Deactivated successfully.
Dec 01 09:44:09 compute-0 systemd[1]: libpod-3bd71b516023d86831a53ea6aeff432b7663afb40db94b742a82e702a8f45761.scope: Consumed 1.021s CPU time.
Dec 01 09:44:09 compute-0 podman[270599]: 2025-12-01 09:44:09.343983065 +0000 UTC m=+0.029325924 container died 3bd71b516023d86831a53ea6aeff432b7663afb40db94b742a82e702a8f45761 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_ptolemy, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, OSD_FLAVOR=default)
Dec 01 09:44:09 compute-0 systemd[1]: var-lib-containers-storage-overlay-64c2e22a2e467daaf7950fec52f49703ecc26274259c002fad14e9110a8d5f84-merged.mount: Deactivated successfully.
Dec 01 09:44:09 compute-0 podman[270599]: 2025-12-01 09:44:09.410212048 +0000 UTC m=+0.095554927 container remove 3bd71b516023d86831a53ea6aeff432b7663afb40db94b742a82e702a8f45761 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_ptolemy, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:44:09 compute-0 systemd[1]: libpod-conmon-3bd71b516023d86831a53ea6aeff432b7663afb40db94b742a82e702a8f45761.scope: Deactivated successfully.
Dec 01 09:44:09 compute-0 sudo[270445]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:09 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:44:09 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:44:09 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:44:09 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:44:09 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 01384acc-5317-47b4-9b63-6b1e42ed7b7f does not exist
Dec 01 09:44:09 compute-0 ceph-mon[75031]: pgmap v1008: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:09 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:44:09 compute-0 sudo[270614]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:44:09 compute-0 sudo[270614]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:44:09 compute-0 sudo[270614]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:09 compute-0 sudo[270639]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:44:09 compute-0 sudo[270639]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:44:09 compute-0 sudo[270639]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:09 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1009: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:10 compute-0 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:44:10 compute-0 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.1 total, 600.0 interval
                                           Cumulative writes: 5306 writes, 22K keys, 5306 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 5306 writes, 841 syncs, 6.31 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1098 writes, 3074 keys, 1098 commit groups, 1.0 writes per commit group, ingest: 1.84 MB, 0.00 MB/s
                                           Interval WAL: 1098 writes, 472 syncs, 2.33 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 01 09:44:10 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:44:10 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:44:11 compute-0 ceph-mon[75031]: pgmap v1009: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:11 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1010: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:13 compute-0 podman[270664]: 2025-12-01 09:44:13.02404234 +0000 UTC m=+0.120096542 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 01 09:44:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:44:13
Dec 01 09:44:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:44:13 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:44:13 compute-0 ceph-mgr[75324]: [balancer INFO root] pools ['images', 'volumes', '.mgr', 'cephfs.cephfs.data', 'cephfs.cephfs.meta', 'backups', 'vms']
Dec 01 09:44:13 compute-0 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec 01 09:44:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:44:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:44:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:44:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:44:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:44:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:44:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:44:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:44:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:44:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:44:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:44:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:44:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:44:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:44:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:44:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:44:13 compute-0 ceph-mon[75031]: pgmap v1010: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:13 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1011: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:15 compute-0 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:44:15 compute-0 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.2 total, 600.0 interval
                                           Cumulative writes: 6021 writes, 24K keys, 6021 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 6021 writes, 1127 syncs, 5.34 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1678 writes, 4735 keys, 1678 commit groups, 1.0 writes per commit group, ingest: 2.54 MB, 0.00 MB/s
                                           Interval WAL: 1678 writes, 729 syncs, 2.30 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 01 09:44:15 compute-0 ceph-mon[75031]: pgmap v1011: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:15 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:44:15 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1012: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:17 compute-0 podman[270690]: 2025-12-01 09:44:17.002484021 +0000 UTC m=+0.092127589 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 01 09:44:17 compute-0 ceph-mon[75031]: pgmap v1012: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:17 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1013: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:44:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:44:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:44:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:44:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 6.359070782053786e-08 of space, bias 1.0, pg target 1.907721234616136e-05 quantized to 32 (current 32)
Dec 01 09:44:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:44:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 6.359070782053786e-08 of space, bias 1.0, pg target 1.907721234616136e-05 quantized to 32 (current 32)
Dec 01 09:44:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:44:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:44:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:44:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0006661126644201341 of space, bias 1.0, pg target 0.19983379932604023 quantized to 32 (current 32)
Dec 01 09:44:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:44:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec 01 09:44:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:44:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:44:19 compute-0 ceph-mon[75031]: pgmap v1013: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:19 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1014: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:44:20.482 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:44:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:44:20.483 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:44:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:44:20.484 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:44:20 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:44:21 compute-0 ceph-mon[75031]: pgmap v1014: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:21 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1015: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:22 compute-0 ceph-mon[75031]: pgmap v1015: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:23 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1016: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:24 compute-0 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:44:24 compute-0 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.4 total, 600.0 interval
                                           Cumulative writes: 5984 writes, 24K keys, 5984 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 5984 writes, 1172 syncs, 5.11 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1822 writes, 4802 keys, 1822 commit groups, 1.0 writes per commit group, ingest: 2.46 MB, 0.00 MB/s
                                           Interval WAL: 1822 writes, 820 syncs, 2.22 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 01 09:44:25 compute-0 ceph-mon[75031]: pgmap v1016: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:44:25 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1017: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:27 compute-0 ceph-mon[75031]: pgmap v1017: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:27 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1018: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:28 compute-0 ceph-mon[75031]: pgmap v1018: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:28 compute-0 ceph-mgr[75324]: [devicehealth INFO root] Check health
Dec 01 09:44:29 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1019: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:30 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:44:30 compute-0 ceph-mon[75031]: pgmap v1019: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:31 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1020: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:31 compute-0 podman[270709]: 2025-12-01 09:44:31.972757855 +0000 UTC m=+0.071372610 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 01 09:44:32 compute-0 ceph-mon[75031]: pgmap v1020: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:33 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1021: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:34 compute-0 ceph-mon[75031]: pgmap v1021: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:35 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:44:35 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1022: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:36 compute-0 nova_compute[250706]: 2025-12-01 09:44:36.416 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:44:36 compute-0 nova_compute[250706]: 2025-12-01 09:44:36.417 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:44:36 compute-0 ceph-mon[75031]: pgmap v1022: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:37 compute-0 nova_compute[250706]: 2025-12-01 09:44:37.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:44:37 compute-0 nova_compute[250706]: 2025-12-01 09:44:37.052 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 09:44:37 compute-0 nova_compute[250706]: 2025-12-01 09:44:37.053 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 09:44:37 compute-0 nova_compute[250706]: 2025-12-01 09:44:37.086 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 09:44:37 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1023: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:38 compute-0 nova_compute[250706]: 2025-12-01 09:44:38.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:44:38 compute-0 nova_compute[250706]: 2025-12-01 09:44:38.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:44:38 compute-0 nova_compute[250706]: 2025-12-01 09:44:38.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:44:38 compute-0 nova_compute[250706]: 2025-12-01 09:44:38.053 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 09:44:38 compute-0 ceph-mon[75031]: pgmap v1023: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:39 compute-0 nova_compute[250706]: 2025-12-01 09:44:39.048 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:44:39 compute-0 nova_compute[250706]: 2025-12-01 09:44:39.074 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:44:39 compute-0 nova_compute[250706]: 2025-12-01 09:44:39.074 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:44:39 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1024: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:40 compute-0 ceph-mon[75031]: pgmap v1024: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:44:41 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1025: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:42 compute-0 ceph-mon[75031]: pgmap v1025: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:43 compute-0 nova_compute[250706]: 2025-12-01 09:44:43.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:44:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:44:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:44:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:44:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:44:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:44:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:44:43 compute-0 nova_compute[250706]: 2025-12-01 09:44:43.083 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:44:43 compute-0 nova_compute[250706]: 2025-12-01 09:44:43.083 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:44:43 compute-0 nova_compute[250706]: 2025-12-01 09:44:43.084 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:44:43 compute-0 nova_compute[250706]: 2025-12-01 09:44:43.084 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 09:44:43 compute-0 nova_compute[250706]: 2025-12-01 09:44:43.084 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:44:43 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:44:43 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3565937543' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:44:43 compute-0 nova_compute[250706]: 2025-12-01 09:44:43.596 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.512s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:44:43 compute-0 nova_compute[250706]: 2025-12-01 09:44:43.806 250710 WARNING nova.virt.libvirt.driver [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 09:44:43 compute-0 nova_compute[250706]: 2025-12-01 09:44:43.807 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5177MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 09:44:43 compute-0 nova_compute[250706]: 2025-12-01 09:44:43.808 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:44:43 compute-0 nova_compute[250706]: 2025-12-01 09:44:43.808 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:44:43 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1026: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:43 compute-0 nova_compute[250706]: 2025-12-01 09:44:43.929 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 09:44:43 compute-0 nova_compute[250706]: 2025-12-01 09:44:43.929 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 09:44:43 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3565937543' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:44:43 compute-0 nova_compute[250706]: 2025-12-01 09:44:43.946 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:44:44 compute-0 podman[270751]: 2025-12-01 09:44:44.018152086 +0000 UTC m=+0.112271378 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 01 09:44:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:44:44 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2157335569' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:44:44 compute-0 nova_compute[250706]: 2025-12-01 09:44:44.390 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:44:44 compute-0 nova_compute[250706]: 2025-12-01 09:44:44.398 250710 DEBUG nova.compute.provider_tree [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed in ProviderTree for provider: 847e3dbe-0f76-4032-a374-8c965945c22f update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 09:44:44 compute-0 nova_compute[250706]: 2025-12-01 09:44:44.414 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed for provider 847e3dbe-0f76-4032-a374-8c965945c22f based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 09:44:44 compute-0 nova_compute[250706]: 2025-12-01 09:44:44.417 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 09:44:44 compute-0 nova_compute[250706]: 2025-12-01 09:44:44.417 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.609s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:44:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Dec 01 09:44:44 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3695607166' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:44:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Dec 01 09:44:44 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3695607166' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:44:44 compute-0 ceph-mon[75031]: pgmap v1026: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:44 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2157335569' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:44:44 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/3695607166' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:44:44 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/3695607166' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:44:45 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:44:45 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1027: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:45 compute-0 ceph-mon[75031]: pgmap v1027: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:47 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1028: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:47 compute-0 podman[270799]: 2025-12-01 09:44:47.994490326 +0000 UTC m=+0.082647346 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 01 09:44:48 compute-0 ceph-mon[75031]: pgmap v1028: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:49 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1029: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:50 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:44:50 compute-0 ceph-mon[75031]: pgmap v1029: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:51 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1030: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:52 compute-0 ceph-mon[75031]: pgmap v1030: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:53 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1031: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:55 compute-0 ceph-mon[75031]: pgmap v1031: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:55 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:44:55 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1032: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:56 compute-0 ceph-mon[75031]: pgmap v1032: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:57 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1033: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:58 compute-0 ceph-mon[75031]: pgmap v1033: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:44:59 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1034: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:00 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:45:00 compute-0 ceph-mon[75031]: pgmap v1034: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:01 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1035: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:02 compute-0 ceph-mon[75031]: pgmap v1035: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:02 compute-0 podman[270819]: 2025-12-01 09:45:02.957729151 +0000 UTC m=+0.065870574 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 01 09:45:03 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1036: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:04 compute-0 ceph-mon[75031]: pgmap v1036: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:05 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:45:05 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1037: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:05 compute-0 ceph-mon[75031]: pgmap v1037: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:07 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1038: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:08 compute-0 ceph-mon[75031]: pgmap v1038: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:09 compute-0 sudo[270840]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:45:09 compute-0 sudo[270840]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:45:09 compute-0 sudo[270840]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:09 compute-0 sudo[270865]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:45:09 compute-0 sudo[270865]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:45:09 compute-0 sudo[270865]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:09 compute-0 sudo[270890]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:45:09 compute-0 sudo[270890]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:45:09 compute-0 sudo[270890]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:09 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1039: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:09 compute-0 sudo[270915]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Dec 01 09:45:09 compute-0 sudo[270915]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:45:10 compute-0 sudo[270915]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:10 compute-0 sudo[270970]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:45:10 compute-0 sudo[270970]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:45:10 compute-0 sudo[270970]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:10 compute-0 sudo[270995]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:45:10 compute-0 sudo[270995]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:45:10 compute-0 sudo[270995]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:10 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:45:10 compute-0 sudo[271020]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:45:10 compute-0 sudo[271020]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:45:10 compute-0 sudo[271020]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:10 compute-0 sudo[271045]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- inventory --format=json-pretty --filter-for-batch
Dec 01 09:45:10 compute-0 sudo[271045]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:45:11 compute-0 ceph-mon[75031]: pgmap v1039: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:11 compute-0 podman[271110]: 2025-12-01 09:45:11.168367994 +0000 UTC m=+0.023252739 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:45:11 compute-0 podman[271110]: 2025-12-01 09:45:11.31019467 +0000 UTC m=+0.165079425 container create d07c443c459787c4ed0a4774255779959c232fa8c29672d82c8e496b092d052a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_taussig, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:45:11 compute-0 systemd[1]: Started libpod-conmon-d07c443c459787c4ed0a4774255779959c232fa8c29672d82c8e496b092d052a.scope.
Dec 01 09:45:11 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:45:11 compute-0 podman[271110]: 2025-12-01 09:45:11.539344125 +0000 UTC m=+0.394228890 container init d07c443c459787c4ed0a4774255779959c232fa8c29672d82c8e496b092d052a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_taussig, io.buildah.version=1.39.3, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:45:11 compute-0 podman[271110]: 2025-12-01 09:45:11.553963425 +0000 UTC m=+0.408848140 container start d07c443c459787c4ed0a4774255779959c232fa8c29672d82c8e496b092d052a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_taussig, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:45:11 compute-0 podman[271110]: 2025-12-01 09:45:11.5586715 +0000 UTC m=+0.413556265 container attach d07c443c459787c4ed0a4774255779959c232fa8c29672d82c8e496b092d052a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_taussig, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=reef)
Dec 01 09:45:11 compute-0 sad_taussig[271126]: 167 167
Dec 01 09:45:11 compute-0 systemd[1]: libpod-d07c443c459787c4ed0a4774255779959c232fa8c29672d82c8e496b092d052a.scope: Deactivated successfully.
Dec 01 09:45:11 compute-0 podman[271110]: 2025-12-01 09:45:11.56215623 +0000 UTC m=+0.417040965 container died d07c443c459787c4ed0a4774255779959c232fa8c29672d82c8e496b092d052a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_taussig, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:45:11 compute-0 systemd[1]: var-lib-containers-storage-overlay-fedaa50ecb23b227cd8b0f83117d881c869cc181e8d6033fcc4e44972f08122c-merged.mount: Deactivated successfully.
Dec 01 09:45:11 compute-0 podman[271110]: 2025-12-01 09:45:11.615714389 +0000 UTC m=+0.470599104 container remove d07c443c459787c4ed0a4774255779959c232fa8c29672d82c8e496b092d052a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_taussig, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS)
Dec 01 09:45:11 compute-0 systemd[1]: libpod-conmon-d07c443c459787c4ed0a4774255779959c232fa8c29672d82c8e496b092d052a.scope: Deactivated successfully.
Dec 01 09:45:11 compute-0 podman[271148]: 2025-12-01 09:45:11.812016 +0000 UTC m=+0.048545346 container create 66bbd1ecac41e0ddd43ab45a6f74fedb7dcb652b6faecdab191b67e760df6cb0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_hoover, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, OSD_FLAVOR=default)
Dec 01 09:45:11 compute-0 systemd[1]: Started libpod-conmon-66bbd1ecac41e0ddd43ab45a6f74fedb7dcb652b6faecdab191b67e760df6cb0.scope.
Dec 01 09:45:11 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:45:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8fe0e732668db96d48b95feef1bfc06dc1421e0f129deb5f47b1ee9f5d0b0082/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:45:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8fe0e732668db96d48b95feef1bfc06dc1421e0f129deb5f47b1ee9f5d0b0082/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:45:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8fe0e732668db96d48b95feef1bfc06dc1421e0f129deb5f47b1ee9f5d0b0082/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:45:11 compute-0 podman[271148]: 2025-12-01 09:45:11.790329797 +0000 UTC m=+0.026859143 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:45:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8fe0e732668db96d48b95feef1bfc06dc1421e0f129deb5f47b1ee9f5d0b0082/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:45:11 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1040: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:11 compute-0 podman[271148]: 2025-12-01 09:45:11.893984805 +0000 UTC m=+0.130514131 container init 66bbd1ecac41e0ddd43ab45a6f74fedb7dcb652b6faecdab191b67e760df6cb0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_hoover, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Dec 01 09:45:11 compute-0 podman[271148]: 2025-12-01 09:45:11.905360522 +0000 UTC m=+0.141889838 container start 66bbd1ecac41e0ddd43ab45a6f74fedb7dcb652b6faecdab191b67e760df6cb0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_hoover, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True)
Dec 01 09:45:11 compute-0 podman[271148]: 2025-12-01 09:45:11.909011337 +0000 UTC m=+0.145540673 container attach 66bbd1ecac41e0ddd43ab45a6f74fedb7dcb652b6faecdab191b67e760df6cb0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_hoover, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:45:12 compute-0 ceph-mon[75031]: pgmap v1040: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:45:13
Dec 01 09:45:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:45:13 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:45:13 compute-0 ceph-mgr[75324]: [balancer INFO root] pools ['.mgr', 'backups', 'volumes', 'vms', 'cephfs.cephfs.meta', 'images', 'cephfs.cephfs.data']
Dec 01 09:45:13 compute-0 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec 01 09:45:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:45:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:45:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:45:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:45:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:45:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:45:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:45:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:45:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:45:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:45:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:45:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:45:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:45:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:45:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:45:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]: [
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:     {
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:         "available": false,
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:         "ceph_device": false,
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:         "device_id": "QEMU_DVD-ROM_QM00001",
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:         "lsm_data": {},
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:         "lvs": [],
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:         "path": "/dev/sr0",
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:         "rejected_reasons": [
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:             "Has a FileSystem",
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:             "Insufficient space (<5GB)"
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:         ],
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:         "sys_api": {
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:             "actuators": null,
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:             "device_nodes": "sr0",
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:             "devname": "sr0",
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:             "human_readable_size": "482.00 KB",
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:             "id_bus": "ata",
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:             "model": "QEMU DVD-ROM",
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:             "nr_requests": "2",
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:             "parent": "/dev/sr0",
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:             "partitions": {},
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:             "path": "/dev/sr0",
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:             "removable": "1",
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:             "rev": "2.5+",
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:             "ro": "0",
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:             "rotational": "1",
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:             "sas_address": "",
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:             "sas_device_handle": "",
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:             "scheduler_mode": "mq-deadline",
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:             "sectors": 0,
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:             "sectorsize": "2048",
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:             "size": 493568.0,
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:             "support_discard": "2048",
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:             "type": "disk",
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:             "vendor": "QEMU"
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:         }
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]:     }
Dec 01 09:45:13 compute-0 flamboyant_hoover[271164]: ]
Dec 01 09:45:13 compute-0 systemd[1]: libpod-66bbd1ecac41e0ddd43ab45a6f74fedb7dcb652b6faecdab191b67e760df6cb0.scope: Deactivated successfully.
Dec 01 09:45:13 compute-0 podman[271148]: 2025-12-01 09:45:13.38279365 +0000 UTC m=+1.619322986 container died 66bbd1ecac41e0ddd43ab45a6f74fedb7dcb652b6faecdab191b67e760df6cb0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_hoover, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:45:13 compute-0 systemd[1]: libpod-66bbd1ecac41e0ddd43ab45a6f74fedb7dcb652b6faecdab191b67e760df6cb0.scope: Consumed 1.527s CPU time.
Dec 01 09:45:13 compute-0 systemd[1]: var-lib-containers-storage-overlay-8fe0e732668db96d48b95feef1bfc06dc1421e0f129deb5f47b1ee9f5d0b0082-merged.mount: Deactivated successfully.
Dec 01 09:45:13 compute-0 podman[271148]: 2025-12-01 09:45:13.456663713 +0000 UTC m=+1.693193049 container remove 66bbd1ecac41e0ddd43ab45a6f74fedb7dcb652b6faecdab191b67e760df6cb0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_hoover, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 01 09:45:13 compute-0 systemd[1]: libpod-conmon-66bbd1ecac41e0ddd43ab45a6f74fedb7dcb652b6faecdab191b67e760df6cb0.scope: Deactivated successfully.
Dec 01 09:45:13 compute-0 sudo[271045]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:45:13 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:45:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:45:13 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:45:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"} v 0) v1
Dec 01 09:45:13 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec 01 09:45:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:45:13 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:45:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec 01 09:45:13 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:45:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec 01 09:45:13 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:45:13 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 2eeba024-faa1-452e-8b4d-04018c1a90e4 does not exist
Dec 01 09:45:13 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 00cc33a2-123f-4731-955b-9b1bbe7a748d does not exist
Dec 01 09:45:13 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 2397e287-d409-4226-9d90-a50d8ceb761e does not exist
Dec 01 09:45:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec 01 09:45:13 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:45:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec 01 09:45:13 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:45:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:45:13 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:45:13 compute-0 sudo[273245]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:45:13 compute-0 sudo[273245]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:45:13 compute-0 sudo[273245]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:13 compute-0 sudo[273270]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:45:13 compute-0 sudo[273270]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:45:13 compute-0 sudo[273270]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:13 compute-0 sudo[273295]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:45:13 compute-0 sudo[273295]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:45:13 compute-0 sudo[273295]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:13 compute-0 sudo[273320]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --yes --no-systemd
Dec 01 09:45:13 compute-0 sudo[273320]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:45:13 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1041: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:14 compute-0 podman[273387]: 2025-12-01 09:45:14.209465608 +0000 UTC m=+0.049092722 container create 9cce2812f9acf7c8a48b9a428bb8ca34fed97a8785d73d4f8eac8c86db658508 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_dubinsky, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3)
Dec 01 09:45:14 compute-0 systemd[1]: Started libpod-conmon-9cce2812f9acf7c8a48b9a428bb8ca34fed97a8785d73d4f8eac8c86db658508.scope.
Dec 01 09:45:14 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:45:14 compute-0 podman[273387]: 2025-12-01 09:45:14.19039955 +0000 UTC m=+0.030026694 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:45:14 compute-0 podman[273387]: 2025-12-01 09:45:14.303809119 +0000 UTC m=+0.143436313 container init 9cce2812f9acf7c8a48b9a428bb8ca34fed97a8785d73d4f8eac8c86db658508 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_dubinsky, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.vendor=CentOS)
Dec 01 09:45:14 compute-0 podman[273387]: 2025-12-01 09:45:14.315209016 +0000 UTC m=+0.154836130 container start 9cce2812f9acf7c8a48b9a428bb8ca34fed97a8785d73d4f8eac8c86db658508 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_dubinsky, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:45:14 compute-0 podman[273387]: 2025-12-01 09:45:14.319778298 +0000 UTC m=+0.159405492 container attach 9cce2812f9acf7c8a48b9a428bb8ca34fed97a8785d73d4f8eac8c86db658508 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_dubinsky, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, ceph=True, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:45:14 compute-0 interesting_dubinsky[273405]: 167 167
Dec 01 09:45:14 compute-0 systemd[1]: libpod-9cce2812f9acf7c8a48b9a428bb8ca34fed97a8785d73d4f8eac8c86db658508.scope: Deactivated successfully.
Dec 01 09:45:14 compute-0 podman[273387]: 2025-12-01 09:45:14.32403843 +0000 UTC m=+0.163665544 container died 9cce2812f9acf7c8a48b9a428bb8ca34fed97a8785d73d4f8eac8c86db658508 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_dubinsky, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec 01 09:45:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-af1d7ecef338eebd2adff7114280dd6ad305dbe3c8b8f8f4699110a59f7cef92-merged.mount: Deactivated successfully.
Dec 01 09:45:14 compute-0 podman[273387]: 2025-12-01 09:45:14.366045397 +0000 UTC m=+0.205672521 container remove 9cce2812f9acf7c8a48b9a428bb8ca34fed97a8785d73d4f8eac8c86db658508 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_dubinsky, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec 01 09:45:14 compute-0 systemd[1]: libpod-conmon-9cce2812f9acf7c8a48b9a428bb8ca34fed97a8785d73d4f8eac8c86db658508.scope: Deactivated successfully.
Dec 01 09:45:14 compute-0 podman[273402]: 2025-12-01 09:45:14.381586434 +0000 UTC m=+0.121669157 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 01 09:45:14 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:45:14 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:45:14 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec 01 09:45:14 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:45:14 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:45:14 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:45:14 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:45:14 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:45:14 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:45:14 compute-0 ceph-mon[75031]: pgmap v1041: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:14 compute-0 podman[273456]: 2025-12-01 09:45:14.550433506 +0000 UTC m=+0.052575432 container create 684b74e5e539467f4e93a0f3c659c35490803e60055c9becdc3a511ae69c0ce9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_heyrovsky, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:45:14 compute-0 systemd[1]: Started libpod-conmon-684b74e5e539467f4e93a0f3c659c35490803e60055c9becdc3a511ae69c0ce9.scope.
Dec 01 09:45:14 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:45:14 compute-0 podman[273456]: 2025-12-01 09:45:14.530932206 +0000 UTC m=+0.033074162 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:45:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14be1cb451823c25111c170c3736a5a01d308fc0c9cbaacdd601a8ad78a95289/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:45:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14be1cb451823c25111c170c3736a5a01d308fc0c9cbaacdd601a8ad78a95289/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:45:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14be1cb451823c25111c170c3736a5a01d308fc0c9cbaacdd601a8ad78a95289/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:45:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14be1cb451823c25111c170c3736a5a01d308fc0c9cbaacdd601a8ad78a95289/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:45:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14be1cb451823c25111c170c3736a5a01d308fc0c9cbaacdd601a8ad78a95289/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:45:14 compute-0 podman[273456]: 2025-12-01 09:45:14.648484984 +0000 UTC m=+0.150626980 container init 684b74e5e539467f4e93a0f3c659c35490803e60055c9becdc3a511ae69c0ce9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_heyrovsky, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:45:14 compute-0 podman[273456]: 2025-12-01 09:45:14.656567266 +0000 UTC m=+0.158709192 container start 684b74e5e539467f4e93a0f3c659c35490803e60055c9becdc3a511ae69c0ce9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_heyrovsky, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:45:14 compute-0 podman[273456]: 2025-12-01 09:45:14.660068617 +0000 UTC m=+0.162210623 container attach 684b74e5e539467f4e93a0f3c659c35490803e60055c9becdc3a511ae69c0ce9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_heyrovsky, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:45:15 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:45:15 compute-0 ecstatic_heyrovsky[273473]: --> passed data devices: 0 physical, 3 LVM
Dec 01 09:45:15 compute-0 ecstatic_heyrovsky[273473]: --> relative data size: 1.0
Dec 01 09:45:15 compute-0 ecstatic_heyrovsky[273473]: --> All data devices are unavailable
Dec 01 09:45:15 compute-0 systemd[1]: libpod-684b74e5e539467f4e93a0f3c659c35490803e60055c9becdc3a511ae69c0ce9.scope: Deactivated successfully.
Dec 01 09:45:15 compute-0 systemd[1]: libpod-684b74e5e539467f4e93a0f3c659c35490803e60055c9becdc3a511ae69c0ce9.scope: Consumed 1.091s CPU time.
Dec 01 09:45:15 compute-0 podman[273502]: 2025-12-01 09:45:15.845708458 +0000 UTC m=+0.026842982 container died 684b74e5e539467f4e93a0f3c659c35490803e60055c9becdc3a511ae69c0ce9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_heyrovsky, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507)
Dec 01 09:45:15 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1042: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:16 compute-0 ceph-mon[75031]: pgmap v1042: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:16 compute-0 systemd[1]: var-lib-containers-storage-overlay-14be1cb451823c25111c170c3736a5a01d308fc0c9cbaacdd601a8ad78a95289-merged.mount: Deactivated successfully.
Dec 01 09:45:16 compute-0 podman[273502]: 2025-12-01 09:45:16.264248346 +0000 UTC m=+0.445382880 container remove 684b74e5e539467f4e93a0f3c659c35490803e60055c9becdc3a511ae69c0ce9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_heyrovsky, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_REF=reef)
Dec 01 09:45:16 compute-0 systemd[1]: libpod-conmon-684b74e5e539467f4e93a0f3c659c35490803e60055c9becdc3a511ae69c0ce9.scope: Deactivated successfully.
Dec 01 09:45:16 compute-0 sudo[273320]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:16 compute-0 sudo[273517]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:45:16 compute-0 sudo[273517]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:45:16 compute-0 sudo[273517]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:16 compute-0 sudo[273542]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:45:16 compute-0 sudo[273542]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:45:16 compute-0 sudo[273542]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:16 compute-0 sudo[273567]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:45:16 compute-0 sudo[273567]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:45:16 compute-0 sudo[273567]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:16 compute-0 sudo[273592]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- lvm list --format json
Dec 01 09:45:16 compute-0 sudo[273592]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:45:16 compute-0 podman[273656]: 2025-12-01 09:45:16.981341564 +0000 UTC m=+0.066956556 container create a85ddb46d21059290dc66e1551a59fe20dd0bc1f3d4f8d9bc03934cf70d087db (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_merkle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec 01 09:45:17 compute-0 systemd[1]: Started libpod-conmon-a85ddb46d21059290dc66e1551a59fe20dd0bc1f3d4f8d9bc03934cf70d087db.scope.
Dec 01 09:45:17 compute-0 podman[273656]: 2025-12-01 09:45:16.93597334 +0000 UTC m=+0.021588352 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:45:17 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:45:17 compute-0 podman[273656]: 2025-12-01 09:45:17.100046735 +0000 UTC m=+0.185661757 container init a85ddb46d21059290dc66e1551a59fe20dd0bc1f3d4f8d9bc03934cf70d087db (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_merkle, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, ceph=True, org.label-schema.schema-version=1.0)
Dec 01 09:45:17 compute-0 podman[273656]: 2025-12-01 09:45:17.109461585 +0000 UTC m=+0.195076577 container start a85ddb46d21059290dc66e1551a59fe20dd0bc1f3d4f8d9bc03934cf70d087db (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_merkle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, ceph=True, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec 01 09:45:17 compute-0 recursing_merkle[273672]: 167 167
Dec 01 09:45:17 compute-0 systemd[1]: libpod-a85ddb46d21059290dc66e1551a59fe20dd0bc1f3d4f8d9bc03934cf70d087db.scope: Deactivated successfully.
Dec 01 09:45:17 compute-0 conmon[273672]: conmon a85ddb46d21059290dc6 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-a85ddb46d21059290dc66e1551a59fe20dd0bc1f3d4f8d9bc03934cf70d087db.scope/container/memory.events
Dec 01 09:45:17 compute-0 podman[273656]: 2025-12-01 09:45:17.200645146 +0000 UTC m=+0.286260138 container attach a85ddb46d21059290dc66e1551a59fe20dd0bc1f3d4f8d9bc03934cf70d087db (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_merkle, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507)
Dec 01 09:45:17 compute-0 podman[273656]: 2025-12-01 09:45:17.201659935 +0000 UTC m=+0.287274937 container died a85ddb46d21059290dc66e1551a59fe20dd0bc1f3d4f8d9bc03934cf70d087db (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_merkle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default)
Dec 01 09:45:17 compute-0 systemd[1]: var-lib-containers-storage-overlay-39ebc596f70e61972ef98a0836945b1f3fd0a91c75855419375d447dba0fafa2-merged.mount: Deactivated successfully.
Dec 01 09:45:17 compute-0 podman[273656]: 2025-12-01 09:45:17.399153111 +0000 UTC m=+0.484768103 container remove a85ddb46d21059290dc66e1551a59fe20dd0bc1f3d4f8d9bc03934cf70d087db (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_merkle, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:45:17 compute-0 systemd[1]: libpod-conmon-a85ddb46d21059290dc66e1551a59fe20dd0bc1f3d4f8d9bc03934cf70d087db.scope: Deactivated successfully.
Dec 01 09:45:17 compute-0 podman[273697]: 2025-12-01 09:45:17.620546093 +0000 UTC m=+0.060684015 container create be454cd19164fa4a41fe70736bf92606084aed074cb15a4fd68fc73e3abed4f0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=mystifying_golick, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True)
Dec 01 09:45:17 compute-0 systemd[1]: Started libpod-conmon-be454cd19164fa4a41fe70736bf92606084aed074cb15a4fd68fc73e3abed4f0.scope.
Dec 01 09:45:17 compute-0 podman[273697]: 2025-12-01 09:45:17.590079397 +0000 UTC m=+0.030217419 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:45:17 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:45:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fa790a395f30430ee15c1d26fcbeb0f0220b237412566ffb24c13a363eda6708/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:45:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fa790a395f30430ee15c1d26fcbeb0f0220b237412566ffb24c13a363eda6708/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:45:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fa790a395f30430ee15c1d26fcbeb0f0220b237412566ffb24c13a363eda6708/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:45:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fa790a395f30430ee15c1d26fcbeb0f0220b237412566ffb24c13a363eda6708/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:45:17 compute-0 podman[273697]: 2025-12-01 09:45:17.722853663 +0000 UTC m=+0.162991615 container init be454cd19164fa4a41fe70736bf92606084aed074cb15a4fd68fc73e3abed4f0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=mystifying_golick, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:45:17 compute-0 podman[273697]: 2025-12-01 09:45:17.730578055 +0000 UTC m=+0.170716027 container start be454cd19164fa4a41fe70736bf92606084aed074cb15a4fd68fc73e3abed4f0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=mystifying_golick, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef)
Dec 01 09:45:17 compute-0 podman[273697]: 2025-12-01 09:45:17.735686642 +0000 UTC m=+0.175824614 container attach be454cd19164fa4a41fe70736bf92606084aed074cb15a4fd68fc73e3abed4f0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=mystifying_golick, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:45:17 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1043: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:18 compute-0 mystifying_golick[273713]: {
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:     "0": [
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:         {
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             "devices": [
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "/dev/loop3"
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             ],
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             "lv_name": "ceph_lv0",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             "lv_size": "21470642176",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             "name": "ceph_lv0",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             "tags": {
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.cluster_name": "ceph",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.crush_device_class": "",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.encrypted": "0",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.osd_id": "0",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.type": "block",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.vdo": "0"
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             },
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             "type": "block",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             "vg_name": "ceph_vg0"
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:         }
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:     ],
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:     "1": [
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:         {
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             "devices": [
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "/dev/loop4"
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             ],
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             "lv_name": "ceph_lv1",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             "lv_size": "21470642176",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             "name": "ceph_lv1",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             "tags": {
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.cluster_name": "ceph",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.crush_device_class": "",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.encrypted": "0",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.osd_id": "1",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.type": "block",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.vdo": "0"
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             },
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             "type": "block",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             "vg_name": "ceph_vg1"
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:         }
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:     ],
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:     "2": [
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:         {
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             "devices": [
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "/dev/loop5"
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             ],
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             "lv_name": "ceph_lv2",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             "lv_size": "21470642176",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             "name": "ceph_lv2",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             "tags": {
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.cluster_name": "ceph",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.crush_device_class": "",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.encrypted": "0",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.osd_id": "2",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.type": "block",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:                 "ceph.vdo": "0"
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             },
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             "type": "block",
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:             "vg_name": "ceph_vg2"
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:         }
Dec 01 09:45:18 compute-0 mystifying_golick[273713]:     ]
Dec 01 09:45:18 compute-0 mystifying_golick[273713]: }
Dec 01 09:45:18 compute-0 systemd[1]: libpod-be454cd19164fa4a41fe70736bf92606084aed074cb15a4fd68fc73e3abed4f0.scope: Deactivated successfully.
Dec 01 09:45:18 compute-0 podman[273697]: 2025-12-01 09:45:18.540318604 +0000 UTC m=+0.980456546 container died be454cd19164fa4a41fe70736bf92606084aed074cb15a4fd68fc73e3abed4f0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=mystifying_golick, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, OSD_FLAVOR=default)
Dec 01 09:45:18 compute-0 systemd[1]: var-lib-containers-storage-overlay-fa790a395f30430ee15c1d26fcbeb0f0220b237412566ffb24c13a363eda6708-merged.mount: Deactivated successfully.
Dec 01 09:45:18 compute-0 podman[273697]: 2025-12-01 09:45:18.614413453 +0000 UTC m=+1.054551395 container remove be454cd19164fa4a41fe70736bf92606084aed074cb15a4fd68fc73e3abed4f0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=mystifying_golick, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:45:18 compute-0 systemd[1]: libpod-conmon-be454cd19164fa4a41fe70736bf92606084aed074cb15a4fd68fc73e3abed4f0.scope: Deactivated successfully.
Dec 01 09:45:18 compute-0 sudo[273592]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:18 compute-0 podman[273723]: 2025-12-01 09:45:18.655873725 +0000 UTC m=+0.078306352 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 01 09:45:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:45:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:45:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:45:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:45:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 6.359070782053786e-08 of space, bias 1.0, pg target 1.907721234616136e-05 quantized to 32 (current 32)
Dec 01 09:45:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:45:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 6.359070782053786e-08 of space, bias 1.0, pg target 1.907721234616136e-05 quantized to 32 (current 32)
Dec 01 09:45:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:45:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:45:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:45:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0006661126644201341 of space, bias 1.0, pg target 0.19983379932604023 quantized to 32 (current 32)
Dec 01 09:45:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:45:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec 01 09:45:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:45:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:45:18 compute-0 sudo[273752]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:45:18 compute-0 sudo[273752]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:45:18 compute-0 sudo[273752]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:18 compute-0 sudo[273777]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:45:18 compute-0 sudo[273777]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:45:18 compute-0 sudo[273777]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:18 compute-0 sudo[273802]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:45:18 compute-0 sudo[273802]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:45:18 compute-0 sudo[273802]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:18 compute-0 sudo[273827]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- raw list --format json
Dec 01 09:45:18 compute-0 sudo[273827]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:45:18 compute-0 ceph-mon[75031]: pgmap v1043: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:19 compute-0 podman[273893]: 2025-12-01 09:45:19.253012115 +0000 UTC m=+0.045031485 container create f62bd5976eeda710da9c2bf694506cc1af3e0ae7bc97d293defa6bcbc6f7ae23 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_mcnulty, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:45:19 compute-0 systemd[1]: Started libpod-conmon-f62bd5976eeda710da9c2bf694506cc1af3e0ae7bc97d293defa6bcbc6f7ae23.scope.
Dec 01 09:45:19 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:45:19 compute-0 podman[273893]: 2025-12-01 09:45:19.234998358 +0000 UTC m=+0.027017748 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:45:19 compute-0 podman[273893]: 2025-12-01 09:45:19.344744831 +0000 UTC m=+0.136764231 container init f62bd5976eeda710da9c2bf694506cc1af3e0ae7bc97d293defa6bcbc6f7ae23 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_mcnulty, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef)
Dec 01 09:45:19 compute-0 podman[273893]: 2025-12-01 09:45:19.353274477 +0000 UTC m=+0.145293857 container start f62bd5976eeda710da9c2bf694506cc1af3e0ae7bc97d293defa6bcbc6f7ae23 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_mcnulty, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Dec 01 09:45:19 compute-0 podman[273893]: 2025-12-01 09:45:19.357076976 +0000 UTC m=+0.149096386 container attach f62bd5976eeda710da9c2bf694506cc1af3e0ae7bc97d293defa6bcbc6f7ae23 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_mcnulty, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250507)
Dec 01 09:45:19 compute-0 priceless_mcnulty[273909]: 167 167
Dec 01 09:45:19 compute-0 systemd[1]: libpod-f62bd5976eeda710da9c2bf694506cc1af3e0ae7bc97d293defa6bcbc6f7ae23.scope: Deactivated successfully.
Dec 01 09:45:19 compute-0 podman[273893]: 2025-12-01 09:45:19.360231086 +0000 UTC m=+0.152250466 container died f62bd5976eeda710da9c2bf694506cc1af3e0ae7bc97d293defa6bcbc6f7ae23 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_mcnulty, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True)
Dec 01 09:45:19 compute-0 systemd[1]: var-lib-containers-storage-overlay-e6e0fe34ffedeaa00286e0abeadce8bce76c5526044deeed820f2e2d2d5a505a-merged.mount: Deactivated successfully.
Dec 01 09:45:19 compute-0 podman[273893]: 2025-12-01 09:45:19.40282044 +0000 UTC m=+0.194839810 container remove f62bd5976eeda710da9c2bf694506cc1af3e0ae7bc97d293defa6bcbc6f7ae23 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_mcnulty, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:45:19 compute-0 systemd[1]: libpod-conmon-f62bd5976eeda710da9c2bf694506cc1af3e0ae7bc97d293defa6bcbc6f7ae23.scope: Deactivated successfully.
Dec 01 09:45:19 compute-0 podman[273934]: 2025-12-01 09:45:19.586245101 +0000 UTC m=+0.045079406 container create 0258eac4f665c6a7f68e4cf07403abbbe6c273458b39c87088867e83b025b041 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_golick, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec 01 09:45:19 compute-0 systemd[1]: Started libpod-conmon-0258eac4f665c6a7f68e4cf07403abbbe6c273458b39c87088867e83b025b041.scope.
Dec 01 09:45:19 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:45:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f6337dc2a2980e91144016f337f0e7b721f73ff1012ccbf7f77bad7d871a5d26/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:45:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f6337dc2a2980e91144016f337f0e7b721f73ff1012ccbf7f77bad7d871a5d26/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:45:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f6337dc2a2980e91144016f337f0e7b721f73ff1012ccbf7f77bad7d871a5d26/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:45:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f6337dc2a2980e91144016f337f0e7b721f73ff1012ccbf7f77bad7d871a5d26/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:45:19 compute-0 podman[273934]: 2025-12-01 09:45:19.565078033 +0000 UTC m=+0.023912358 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:45:19 compute-0 podman[273934]: 2025-12-01 09:45:19.668966599 +0000 UTC m=+0.127800934 container init 0258eac4f665c6a7f68e4cf07403abbbe6c273458b39c87088867e83b025b041 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_golick, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:45:19 compute-0 podman[273934]: 2025-12-01 09:45:19.675167757 +0000 UTC m=+0.134002072 container start 0258eac4f665c6a7f68e4cf07403abbbe6c273458b39c87088867e83b025b041 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_golick, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:45:19 compute-0 podman[273934]: 2025-12-01 09:45:19.678349468 +0000 UTC m=+0.137183783 container attach 0258eac4f665c6a7f68e4cf07403abbbe6c273458b39c87088867e83b025b041 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_golick, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:45:19 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1044: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:45:20.483 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:45:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:45:20.484 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:45:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:45:20.485 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:45:20 compute-0 condescending_golick[273950]: {
Dec 01 09:45:20 compute-0 condescending_golick[273950]:     "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec 01 09:45:20 compute-0 condescending_golick[273950]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:45:20 compute-0 condescending_golick[273950]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 01 09:45:20 compute-0 condescending_golick[273950]:         "osd_id": 0,
Dec 01 09:45:20 compute-0 condescending_golick[273950]:         "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:45:20 compute-0 condescending_golick[273950]:         "type": "bluestore"
Dec 01 09:45:20 compute-0 condescending_golick[273950]:     },
Dec 01 09:45:20 compute-0 condescending_golick[273950]:     "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec 01 09:45:20 compute-0 condescending_golick[273950]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:45:20 compute-0 condescending_golick[273950]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 01 09:45:20 compute-0 condescending_golick[273950]:         "osd_id": 1,
Dec 01 09:45:20 compute-0 condescending_golick[273950]:         "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:45:20 compute-0 condescending_golick[273950]:         "type": "bluestore"
Dec 01 09:45:20 compute-0 condescending_golick[273950]:     },
Dec 01 09:45:20 compute-0 condescending_golick[273950]:     "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec 01 09:45:20 compute-0 condescending_golick[273950]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:45:20 compute-0 condescending_golick[273950]:         "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec 01 09:45:20 compute-0 condescending_golick[273950]:         "osd_id": 2,
Dec 01 09:45:20 compute-0 condescending_golick[273950]:         "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:45:20 compute-0 condescending_golick[273950]:         "type": "bluestore"
Dec 01 09:45:20 compute-0 condescending_golick[273950]:     }
Dec 01 09:45:20 compute-0 condescending_golick[273950]: }
Dec 01 09:45:20 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:45:20 compute-0 systemd[1]: libpod-0258eac4f665c6a7f68e4cf07403abbbe6c273458b39c87088867e83b025b041.scope: Deactivated successfully.
Dec 01 09:45:20 compute-0 podman[273934]: 2025-12-01 09:45:20.722068082 +0000 UTC m=+1.180902477 container died 0258eac4f665c6a7f68e4cf07403abbbe6c273458b39c87088867e83b025b041 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_golick, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:45:20 compute-0 systemd[1]: libpod-0258eac4f665c6a7f68e4cf07403abbbe6c273458b39c87088867e83b025b041.scope: Consumed 1.053s CPU time.
Dec 01 09:45:20 compute-0 systemd[1]: var-lib-containers-storage-overlay-f6337dc2a2980e91144016f337f0e7b721f73ff1012ccbf7f77bad7d871a5d26-merged.mount: Deactivated successfully.
Dec 01 09:45:20 compute-0 podman[273934]: 2025-12-01 09:45:20.785874936 +0000 UTC m=+1.244709251 container remove 0258eac4f665c6a7f68e4cf07403abbbe6c273458b39c87088867e83b025b041 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_golick, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef)
Dec 01 09:45:20 compute-0 systemd[1]: libpod-conmon-0258eac4f665c6a7f68e4cf07403abbbe6c273458b39c87088867e83b025b041.scope: Deactivated successfully.
Dec 01 09:45:20 compute-0 sudo[273827]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:20 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:45:20 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:45:20 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:45:20 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:45:20 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 49b6264d-e328-428f-98c2-840aa4c88a19 does not exist
Dec 01 09:45:20 compute-0 sudo[273996]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:45:20 compute-0 sudo[273996]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:45:20 compute-0 sudo[273996]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:20 compute-0 ceph-mon[75031]: pgmap v1044: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:20 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:45:20 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:45:20 compute-0 sudo[274021]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:45:20 compute-0 sudo[274021]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:45:21 compute-0 sudo[274021]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:21 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1045: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:22 compute-0 ceph-mon[75031]: pgmap v1045: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:23 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1046: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:24 compute-0 ceph-mon[75031]: pgmap v1046: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:45:25 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1047: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:26 compute-0 ceph-mon[75031]: pgmap v1047: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:27 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1048: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:27 compute-0 ceph-mon[75031]: pgmap v1048: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:29 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1049: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:30 compute-0 ceph-mon[75031]: pgmap v1049: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:30 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:45:31 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1050: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:32 compute-0 ceph-mon[75031]: pgmap v1050: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:33 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1051: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:34 compute-0 podman[274046]: 2025-12-01 09:45:34.008209482 +0000 UTC m=+0.096719690 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 01 09:45:35 compute-0 ceph-mon[75031]: pgmap v1051: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:35 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:45:35 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1052: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:36 compute-0 ceph-mon[75031]: pgmap v1052: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:37 compute-0 nova_compute[250706]: 2025-12-01 09:45:37.414 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:45:37 compute-0 nova_compute[250706]: 2025-12-01 09:45:37.415 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:45:37 compute-0 nova_compute[250706]: 2025-12-01 09:45:37.415 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 09:45:37 compute-0 nova_compute[250706]: 2025-12-01 09:45:37.415 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 09:45:37 compute-0 nova_compute[250706]: 2025-12-01 09:45:37.431 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 09:45:37 compute-0 nova_compute[250706]: 2025-12-01 09:45:37.432 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:45:37 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1053: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:38 compute-0 nova_compute[250706]: 2025-12-01 09:45:38.051 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:45:38 compute-0 ceph-mon[75031]: pgmap v1053: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:39 compute-0 nova_compute[250706]: 2025-12-01 09:45:39.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:45:39 compute-0 nova_compute[250706]: 2025-12-01 09:45:39.052 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 09:45:39 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1054: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:40 compute-0 nova_compute[250706]: 2025-12-01 09:45:40.053 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:45:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:45:40 compute-0 ceph-mon[75031]: pgmap v1054: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:41 compute-0 nova_compute[250706]: 2025-12-01 09:45:41.053 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:45:41 compute-0 nova_compute[250706]: 2025-12-01 09:45:41.053 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:45:41 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1055: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:42 compute-0 ceph-mon[75031]: pgmap v1055: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:45:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:45:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:45:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:45:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:45:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:45:43 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1056: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:44 compute-0 nova_compute[250706]: 2025-12-01 09:45:44.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:45:44 compute-0 nova_compute[250706]: 2025-12-01 09:45:44.092 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:45:44 compute-0 nova_compute[250706]: 2025-12-01 09:45:44.093 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:45:44 compute-0 nova_compute[250706]: 2025-12-01 09:45:44.093 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:45:44 compute-0 nova_compute[250706]: 2025-12-01 09:45:44.093 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 09:45:44 compute-0 nova_compute[250706]: 2025-12-01 09:45:44.094 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:45:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Dec 01 09:45:44 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1837981429' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:45:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Dec 01 09:45:44 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1837981429' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:45:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:45:44 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3573441807' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:45:44 compute-0 nova_compute[250706]: 2025-12-01 09:45:44.634 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.540s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:45:44 compute-0 nova_compute[250706]: 2025-12-01 09:45:44.826 250710 WARNING nova.virt.libvirt.driver [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 09:45:44 compute-0 nova_compute[250706]: 2025-12-01 09:45:44.828 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5165MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 09:45:44 compute-0 nova_compute[250706]: 2025-12-01 09:45:44.828 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:45:44 compute-0 nova_compute[250706]: 2025-12-01 09:45:44.828 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:45:44 compute-0 nova_compute[250706]: 2025-12-01 09:45:44.946 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 09:45:44 compute-0 nova_compute[250706]: 2025-12-01 09:45:44.947 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 09:45:44 compute-0 ceph-mon[75031]: pgmap v1056: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:44 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/1837981429' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:45:44 compute-0 nova_compute[250706]: 2025-12-01 09:45:44.968 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:45:44 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/1837981429' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:45:44 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3573441807' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:45:45 compute-0 podman[274091]: 2025-12-01 09:45:45.029802174 +0000 UTC m=+0.125801577 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 01 09:45:45 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:45:45 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2279055531' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:45:45 compute-0 nova_compute[250706]: 2025-12-01 09:45:45.410 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:45:45 compute-0 nova_compute[250706]: 2025-12-01 09:45:45.419 250710 DEBUG nova.compute.provider_tree [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed in ProviderTree for provider: 847e3dbe-0f76-4032-a374-8c965945c22f update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 09:45:45 compute-0 nova_compute[250706]: 2025-12-01 09:45:45.445 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed for provider 847e3dbe-0f76-4032-a374-8c965945c22f based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 09:45:45 compute-0 nova_compute[250706]: 2025-12-01 09:45:45.447 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 09:45:45 compute-0 nova_compute[250706]: 2025-12-01 09:45:45.447 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:45:45 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:45:45 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1057: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:45 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2279055531' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:45:45 compute-0 ceph-mon[75031]: pgmap v1057: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:47 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1058: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:48 compute-0 ceph-mon[75031]: pgmap v1058: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:48 compute-0 podman[274139]: 2025-12-01 09:45:48.969908612 +0000 UTC m=+0.062358153 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 01 09:45:49 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1059: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:50 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:45:50 compute-0 ceph-mon[75031]: pgmap v1059: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:51 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1060: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:52 compute-0 ceph-mon[75031]: pgmap v1060: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:53 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1061: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:54 compute-0 ceph-mon[75031]: pgmap v1061: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:55 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:45:55 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1062: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:56 compute-0 ceph-mon[75031]: pgmap v1062: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:57 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1063: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:58 compute-0 ceph-mon[75031]: pgmap v1063: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:45:59 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1064: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:00 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:46:00 compute-0 ceph-mon[75031]: pgmap v1064: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:01 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1065: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:02 compute-0 ceph-mon[75031]: pgmap v1065: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:03 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1066: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:04 compute-0 ceph-mon[75031]: pgmap v1066: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:04 compute-0 podman[274159]: 2025-12-01 09:46:04.972899156 +0000 UTC m=+0.068381926 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd)
Dec 01 09:46:05 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:46:05 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1067: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:06 compute-0 ceph-mon[75031]: pgmap v1067: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:07 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1068: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:08 compute-0 ceph-mon[75031]: pgmap v1068: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:09 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1069: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:10 compute-0 ceph-mon[75031]: pgmap v1069: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:10 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:46:11 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1070: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:12 compute-0 ceph-mon[75031]: pgmap v1070: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:46:13
Dec 01 09:46:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:46:13 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:46:13 compute-0 ceph-mgr[75324]: [balancer INFO root] pools ['backups', 'cephfs.cephfs.meta', '.mgr', 'images', 'vms', 'volumes', 'cephfs.cephfs.data']
Dec 01 09:46:13 compute-0 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec 01 09:46:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:46:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:46:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:46:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:46:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:46:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:46:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:46:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:46:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:46:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:46:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:46:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:46:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:46:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:46:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:46:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:46:13 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1071: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:13 compute-0 ceph-mon[75031]: pgmap v1071: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:15 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:46:15 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1072: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:15 compute-0 podman[274180]: 2025-12-01 09:46:15.996498844 +0000 UTC m=+0.093644302 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 01 09:46:16 compute-0 ceph-mon[75031]: pgmap v1072: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:17 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1073: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:46:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:46:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:46:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:46:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 6.359070782053786e-08 of space, bias 1.0, pg target 1.907721234616136e-05 quantized to 32 (current 32)
Dec 01 09:46:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:46:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 6.359070782053786e-08 of space, bias 1.0, pg target 1.907721234616136e-05 quantized to 32 (current 32)
Dec 01 09:46:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:46:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:46:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:46:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0006661126644201341 of space, bias 1.0, pg target 0.19983379932604023 quantized to 32 (current 32)
Dec 01 09:46:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:46:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec 01 09:46:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:46:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:46:18 compute-0 ceph-mon[75031]: pgmap v1073: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:19 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1074: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:19 compute-0 podman[274207]: 2025-12-01 09:46:19.968698454 +0000 UTC m=+0.073155363 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:46:19 compute-0 ceph-mon[75031]: pgmap v1074: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:46:20.483 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:46:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:46:20.484 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:46:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:46:20.484 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:46:20 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:46:21 compute-0 sudo[274226]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:46:21 compute-0 sudo[274226]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:46:21 compute-0 sudo[274226]: pam_unix(sudo:session): session closed for user root
Dec 01 09:46:21 compute-0 sudo[274251]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:46:21 compute-0 sudo[274251]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:46:21 compute-0 sudo[274251]: pam_unix(sudo:session): session closed for user root
Dec 01 09:46:21 compute-0 sudo[274276]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:46:21 compute-0 sudo[274276]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:46:21 compute-0 sudo[274276]: pam_unix(sudo:session): session closed for user root
Dec 01 09:46:21 compute-0 sudo[274301]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Dec 01 09:46:21 compute-0 sudo[274301]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:46:21 compute-0 sudo[274301]: pam_unix(sudo:session): session closed for user root
Dec 01 09:46:21 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:46:21 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:46:21 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec 01 09:46:21 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:46:21 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec 01 09:46:21 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:46:21 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev a8efb053-65f0-4407-8bed-a90790a1408e does not exist
Dec 01 09:46:21 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 17043d58-8d34-4720-8e12-5867103af4b7 does not exist
Dec 01 09:46:21 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 3578bc76-3892-4a8f-8053-cad4458d1ba5 does not exist
Dec 01 09:46:21 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec 01 09:46:21 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:46:21 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec 01 09:46:21 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:46:21 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:46:21 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:46:21 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1075: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:21 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:46:21 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:46:21 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:46:21 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:46:21 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:46:21 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:46:21 compute-0 sudo[274357]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:46:21 compute-0 sudo[274357]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:46:21 compute-0 sudo[274357]: pam_unix(sudo:session): session closed for user root
Dec 01 09:46:22 compute-0 sudo[274382]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:46:22 compute-0 sudo[274382]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:46:22 compute-0 sudo[274382]: pam_unix(sudo:session): session closed for user root
Dec 01 09:46:22 compute-0 sudo[274407]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:46:22 compute-0 sudo[274407]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:46:22 compute-0 sudo[274407]: pam_unix(sudo:session): session closed for user root
Dec 01 09:46:22 compute-0 sudo[274432]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --yes --no-systemd
Dec 01 09:46:22 compute-0 sudo[274432]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:46:22 compute-0 podman[274497]: 2025-12-01 09:46:22.587821671 +0000 UTC m=+0.042856312 container create ec2cbcc56247bea35c20bfdd27a69ca0f63e883a8a41d1c5690f986f1b290cc3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_aryabhata, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 01 09:46:22 compute-0 systemd[1]: Started libpod-conmon-ec2cbcc56247bea35c20bfdd27a69ca0f63e883a8a41d1c5690f986f1b290cc3.scope.
Dec 01 09:46:22 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:46:22 compute-0 podman[274497]: 2025-12-01 09:46:22.568365972 +0000 UTC m=+0.023400623 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:46:22 compute-0 podman[274497]: 2025-12-01 09:46:22.675583864 +0000 UTC m=+0.130618575 container init ec2cbcc56247bea35c20bfdd27a69ca0f63e883a8a41d1c5690f986f1b290cc3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_aryabhata, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:46:22 compute-0 podman[274497]: 2025-12-01 09:46:22.688584497 +0000 UTC m=+0.143619168 container start ec2cbcc56247bea35c20bfdd27a69ca0f63e883a8a41d1c5690f986f1b290cc3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_aryabhata, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.build-date=20250507, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Dec 01 09:46:22 compute-0 podman[274497]: 2025-12-01 09:46:22.692407947 +0000 UTC m=+0.147442628 container attach ec2cbcc56247bea35c20bfdd27a69ca0f63e883a8a41d1c5690f986f1b290cc3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_aryabhata, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Dec 01 09:46:22 compute-0 naughty_aryabhata[274514]: 167 167
Dec 01 09:46:22 compute-0 systemd[1]: libpod-ec2cbcc56247bea35c20bfdd27a69ca0f63e883a8a41d1c5690f986f1b290cc3.scope: Deactivated successfully.
Dec 01 09:46:22 compute-0 podman[274497]: 2025-12-01 09:46:22.695730393 +0000 UTC m=+0.150765064 container died ec2cbcc56247bea35c20bfdd27a69ca0f63e883a8a41d1c5690f986f1b290cc3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_aryabhata, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:46:22 compute-0 systemd[1]: var-lib-containers-storage-overlay-8f2e23ae3fbd68c6084ca79149514a1c4a1183482e88fc4d21f069c5f55e1d3b-merged.mount: Deactivated successfully.
Dec 01 09:46:22 compute-0 podman[274497]: 2025-12-01 09:46:22.747867721 +0000 UTC m=+0.202902352 container remove ec2cbcc56247bea35c20bfdd27a69ca0f63e883a8a41d1c5690f986f1b290cc3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_aryabhata, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507)
Dec 01 09:46:22 compute-0 systemd[1]: libpod-conmon-ec2cbcc56247bea35c20bfdd27a69ca0f63e883a8a41d1c5690f986f1b290cc3.scope: Deactivated successfully.
Dec 01 09:46:22 compute-0 ceph-mon[75031]: pgmap v1075: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:23 compute-0 podman[274539]: 2025-12-01 09:46:23.018451 +0000 UTC m=+0.108493802 container create 6895a3af487a7320fa33febba0ef27a9cd235bbbef0fe89b67e9576b815c32bb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elated_pike, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:46:23 compute-0 systemd[1]: Started libpod-conmon-6895a3af487a7320fa33febba0ef27a9cd235bbbef0fe89b67e9576b815c32bb.scope.
Dec 01 09:46:23 compute-0 podman[274539]: 2025-12-01 09:46:22.999850725 +0000 UTC m=+0.089893547 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:46:23 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:46:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f88ef32d829289a528e5dfd64ca40c3679267e8f85cd628a11b7a4560e290a4e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:46:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f88ef32d829289a528e5dfd64ca40c3679267e8f85cd628a11b7a4560e290a4e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:46:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f88ef32d829289a528e5dfd64ca40c3679267e8f85cd628a11b7a4560e290a4e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:46:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f88ef32d829289a528e5dfd64ca40c3679267e8f85cd628a11b7a4560e290a4e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:46:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f88ef32d829289a528e5dfd64ca40c3679267e8f85cd628a11b7a4560e290a4e/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:46:23 compute-0 podman[274539]: 2025-12-01 09:46:23.129131239 +0000 UTC m=+0.219174051 container init 6895a3af487a7320fa33febba0ef27a9cd235bbbef0fe89b67e9576b815c32bb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elated_pike, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:46:23 compute-0 podman[274539]: 2025-12-01 09:46:23.137923602 +0000 UTC m=+0.227966404 container start 6895a3af487a7320fa33febba0ef27a9cd235bbbef0fe89b67e9576b815c32bb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elated_pike, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:46:23 compute-0 podman[274539]: 2025-12-01 09:46:23.141086033 +0000 UTC m=+0.231128845 container attach 6895a3af487a7320fa33febba0ef27a9cd235bbbef0fe89b67e9576b815c32bb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elated_pike, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:46:23 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1076: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:23 compute-0 ceph-mon[75031]: pgmap v1076: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:24 compute-0 elated_pike[274555]: --> passed data devices: 0 physical, 3 LVM
Dec 01 09:46:24 compute-0 elated_pike[274555]: --> relative data size: 1.0
Dec 01 09:46:24 compute-0 elated_pike[274555]: --> All data devices are unavailable
Dec 01 09:46:24 compute-0 systemd[1]: libpod-6895a3af487a7320fa33febba0ef27a9cd235bbbef0fe89b67e9576b815c32bb.scope: Deactivated successfully.
Dec 01 09:46:24 compute-0 systemd[1]: libpod-6895a3af487a7320fa33febba0ef27a9cd235bbbef0fe89b67e9576b815c32bb.scope: Consumed 1.128s CPU time.
Dec 01 09:46:24 compute-0 podman[274584]: 2025-12-01 09:46:24.344461252 +0000 UTC m=+0.027866662 container died 6895a3af487a7320fa33febba0ef27a9cd235bbbef0fe89b67e9576b815c32bb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elated_pike, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef)
Dec 01 09:46:24 compute-0 systemd[1]: var-lib-containers-storage-overlay-f88ef32d829289a528e5dfd64ca40c3679267e8f85cd628a11b7a4560e290a4e-merged.mount: Deactivated successfully.
Dec 01 09:46:24 compute-0 podman[274584]: 2025-12-01 09:46:24.406585248 +0000 UTC m=+0.089990658 container remove 6895a3af487a7320fa33febba0ef27a9cd235bbbef0fe89b67e9576b815c32bb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elated_pike, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:46:24 compute-0 systemd[1]: libpod-conmon-6895a3af487a7320fa33febba0ef27a9cd235bbbef0fe89b67e9576b815c32bb.scope: Deactivated successfully.
Dec 01 09:46:24 compute-0 sudo[274432]: pam_unix(sudo:session): session closed for user root
Dec 01 09:46:24 compute-0 sudo[274599]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:46:24 compute-0 sudo[274599]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:46:24 compute-0 sudo[274599]: pam_unix(sudo:session): session closed for user root
Dec 01 09:46:24 compute-0 sudo[274624]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:46:24 compute-0 sudo[274624]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:46:24 compute-0 sudo[274624]: pam_unix(sudo:session): session closed for user root
Dec 01 09:46:24 compute-0 sudo[274649]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:46:24 compute-0 sudo[274649]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:46:24 compute-0 sudo[274649]: pam_unix(sudo:session): session closed for user root
Dec 01 09:46:24 compute-0 sudo[274674]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- lvm list --format json
Dec 01 09:46:24 compute-0 sudo[274674]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:46:25 compute-0 podman[274740]: 2025-12-01 09:46:25.180160628 +0000 UTC m=+0.048897986 container create afa0a49bf8dfc587ca00bfcfc35a15c18c841a7131ba6de220588f437c90c1ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_pike, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:46:25 compute-0 systemd[1]: Started libpod-conmon-afa0a49bf8dfc587ca00bfcfc35a15c18c841a7131ba6de220588f437c90c1ca.scope.
Dec 01 09:46:25 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:46:25 compute-0 podman[274740]: 2025-12-01 09:46:25.162099019 +0000 UTC m=+0.030836387 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:46:25 compute-0 podman[274740]: 2025-12-01 09:46:25.281420598 +0000 UTC m=+0.150157986 container init afa0a49bf8dfc587ca00bfcfc35a15c18c841a7131ba6de220588f437c90c1ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_pike, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Dec 01 09:46:25 compute-0 podman[274740]: 2025-12-01 09:46:25.292922799 +0000 UTC m=+0.161660197 container start afa0a49bf8dfc587ca00bfcfc35a15c18c841a7131ba6de220588f437c90c1ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_pike, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef)
Dec 01 09:46:25 compute-0 admiring_pike[274756]: 167 167
Dec 01 09:46:25 compute-0 podman[274740]: 2025-12-01 09:46:25.298126578 +0000 UTC m=+0.166863936 container attach afa0a49bf8dfc587ca00bfcfc35a15c18c841a7131ba6de220588f437c90c1ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_pike, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Dec 01 09:46:25 compute-0 systemd[1]: libpod-afa0a49bf8dfc587ca00bfcfc35a15c18c841a7131ba6de220588f437c90c1ca.scope: Deactivated successfully.
Dec 01 09:46:25 compute-0 podman[274740]: 2025-12-01 09:46:25.301234868 +0000 UTC m=+0.169972226 container died afa0a49bf8dfc587ca00bfcfc35a15c18c841a7131ba6de220588f437c90c1ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_pike, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3)
Dec 01 09:46:25 compute-0 systemd[1]: var-lib-containers-storage-overlay-5d90adf5c0c3e642c2c3b1e08887c89e8adc5123229cd1d569a7c0b4fce1fb84-merged.mount: Deactivated successfully.
Dec 01 09:46:25 compute-0 podman[274740]: 2025-12-01 09:46:25.342546375 +0000 UTC m=+0.211283743 container remove afa0a49bf8dfc587ca00bfcfc35a15c18c841a7131ba6de220588f437c90c1ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_pike, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2)
Dec 01 09:46:25 compute-0 systemd[1]: libpod-conmon-afa0a49bf8dfc587ca00bfcfc35a15c18c841a7131ba6de220588f437c90c1ca.scope: Deactivated successfully.
Dec 01 09:46:25 compute-0 podman[274780]: 2025-12-01 09:46:25.526254264 +0000 UTC m=+0.043618824 container create 8db40c41ad2bb698e8d0657fa89eddf12031cc7fb0c34ac4b151a2362dde22da (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_sutherland, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec 01 09:46:25 compute-0 systemd[1]: Started libpod-conmon-8db40c41ad2bb698e8d0657fa89eddf12031cc7fb0c34ac4b151a2362dde22da.scope.
Dec 01 09:46:25 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:46:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ffcbed35551e2802754b9a46fb0b1b3d089b0b3d4b86cef3c9dc6a840628dcc7/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:46:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ffcbed35551e2802754b9a46fb0b1b3d089b0b3d4b86cef3c9dc6a840628dcc7/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:46:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ffcbed35551e2802754b9a46fb0b1b3d089b0b3d4b86cef3c9dc6a840628dcc7/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:46:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ffcbed35551e2802754b9a46fb0b1b3d089b0b3d4b86cef3c9dc6a840628dcc7/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:46:25 compute-0 podman[274780]: 2025-12-01 09:46:25.506560168 +0000 UTC m=+0.023924748 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:46:25 compute-0 podman[274780]: 2025-12-01 09:46:25.618941828 +0000 UTC m=+0.136306408 container init 8db40c41ad2bb698e8d0657fa89eddf12031cc7fb0c34ac4b151a2362dde22da (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_sutherland, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.build-date=20250507)
Dec 01 09:46:25 compute-0 podman[274780]: 2025-12-01 09:46:25.625132686 +0000 UTC m=+0.142497246 container start 8db40c41ad2bb698e8d0657fa89eddf12031cc7fb0c34ac4b151a2362dde22da (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_sutherland, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507)
Dec 01 09:46:25 compute-0 podman[274780]: 2025-12-01 09:46:25.62877432 +0000 UTC m=+0.146138880 container attach 8db40c41ad2bb698e8d0657fa89eddf12031cc7fb0c34ac4b151a2362dde22da (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_sutherland, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507)
Dec 01 09:46:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:46:25 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1077: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]: {
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:     "0": [
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:         {
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             "devices": [
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "/dev/loop3"
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             ],
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             "lv_name": "ceph_lv0",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             "lv_size": "21470642176",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             "name": "ceph_lv0",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             "tags": {
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.cluster_name": "ceph",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.crush_device_class": "",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.encrypted": "0",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.osd_id": "0",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.type": "block",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.vdo": "0"
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             },
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             "type": "block",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             "vg_name": "ceph_vg0"
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:         }
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:     ],
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:     "1": [
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:         {
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             "devices": [
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "/dev/loop4"
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             ],
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             "lv_name": "ceph_lv1",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             "lv_size": "21470642176",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             "name": "ceph_lv1",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             "tags": {
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.cluster_name": "ceph",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.crush_device_class": "",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.encrypted": "0",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.osd_id": "1",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.type": "block",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.vdo": "0"
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             },
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             "type": "block",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             "vg_name": "ceph_vg1"
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:         }
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:     ],
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:     "2": [
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:         {
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             "devices": [
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "/dev/loop5"
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             ],
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             "lv_name": "ceph_lv2",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             "lv_size": "21470642176",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             "name": "ceph_lv2",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             "tags": {
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.cluster_name": "ceph",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.crush_device_class": "",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.encrypted": "0",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.osd_id": "2",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.type": "block",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:                 "ceph.vdo": "0"
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             },
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             "type": "block",
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:             "vg_name": "ceph_vg2"
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:         }
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]:     ]
Dec 01 09:46:26 compute-0 lucid_sutherland[274796]: }
Dec 01 09:46:26 compute-0 systemd[1]: libpod-8db40c41ad2bb698e8d0657fa89eddf12031cc7fb0c34ac4b151a2362dde22da.scope: Deactivated successfully.
Dec 01 09:46:26 compute-0 podman[274780]: 2025-12-01 09:46:26.381301866 +0000 UTC m=+0.898666446 container died 8db40c41ad2bb698e8d0657fa89eddf12031cc7fb0c34ac4b151a2362dde22da (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_sutherland, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:46:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-ffcbed35551e2802754b9a46fb0b1b3d089b0b3d4b86cef3c9dc6a840628dcc7-merged.mount: Deactivated successfully.
Dec 01 09:46:26 compute-0 podman[274780]: 2025-12-01 09:46:26.435219956 +0000 UTC m=+0.952584526 container remove 8db40c41ad2bb698e8d0657fa89eddf12031cc7fb0c34ac4b151a2362dde22da (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_sutherland, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 01 09:46:26 compute-0 systemd[1]: libpod-conmon-8db40c41ad2bb698e8d0657fa89eddf12031cc7fb0c34ac4b151a2362dde22da.scope: Deactivated successfully.
Dec 01 09:46:26 compute-0 sudo[274674]: pam_unix(sudo:session): session closed for user root
Dec 01 09:46:26 compute-0 sudo[274817]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:46:26 compute-0 sudo[274817]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:46:26 compute-0 sudo[274817]: pam_unix(sudo:session): session closed for user root
Dec 01 09:46:26 compute-0 sudo[274842]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:46:26 compute-0 sudo[274842]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:46:26 compute-0 sudo[274842]: pam_unix(sudo:session): session closed for user root
Dec 01 09:46:26 compute-0 sudo[274867]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:46:26 compute-0 sudo[274867]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:46:26 compute-0 sudo[274867]: pam_unix(sudo:session): session closed for user root
Dec 01 09:46:26 compute-0 sudo[274892]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- raw list --format json
Dec 01 09:46:26 compute-0 sudo[274892]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:46:26 compute-0 ceph-mon[75031]: pgmap v1077: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:27 compute-0 podman[274957]: 2025-12-01 09:46:27.072440047 +0000 UTC m=+0.043927633 container create 787c4e5aa7f16dbdab360a39604843913f4249b19e61146b9a18f6e96c2dc68c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_wright, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:46:27 compute-0 systemd[1]: Started libpod-conmon-787c4e5aa7f16dbdab360a39604843913f4249b19e61146b9a18f6e96c2dc68c.scope.
Dec 01 09:46:27 compute-0 podman[274957]: 2025-12-01 09:46:27.055631854 +0000 UTC m=+0.027119470 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:46:27 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:46:27 compute-0 podman[274957]: 2025-12-01 09:46:27.167427507 +0000 UTC m=+0.138915123 container init 787c4e5aa7f16dbdab360a39604843913f4249b19e61146b9a18f6e96c2dc68c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_wright, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:46:27 compute-0 podman[274957]: 2025-12-01 09:46:27.175283303 +0000 UTC m=+0.146770899 container start 787c4e5aa7f16dbdab360a39604843913f4249b19e61146b9a18f6e96c2dc68c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_wright, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:46:27 compute-0 podman[274957]: 2025-12-01 09:46:27.178611868 +0000 UTC m=+0.150099504 container attach 787c4e5aa7f16dbdab360a39604843913f4249b19e61146b9a18f6e96c2dc68c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_wright, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True)
Dec 01 09:46:27 compute-0 recursing_wright[274974]: 167 167
Dec 01 09:46:27 compute-0 systemd[1]: libpod-787c4e5aa7f16dbdab360a39604843913f4249b19e61146b9a18f6e96c2dc68c.scope: Deactivated successfully.
Dec 01 09:46:27 compute-0 podman[274957]: 2025-12-01 09:46:27.180199004 +0000 UTC m=+0.151686610 container died 787c4e5aa7f16dbdab360a39604843913f4249b19e61146b9a18f6e96c2dc68c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_wright, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Dec 01 09:46:27 compute-0 systemd[1]: var-lib-containers-storage-overlay-c56a30bbcfab24098c95e3107966d18820ef76f0778cadc47037c25c7797e3bf-merged.mount: Deactivated successfully.
Dec 01 09:46:27 compute-0 podman[274957]: 2025-12-01 09:46:27.213663575 +0000 UTC m=+0.185151171 container remove 787c4e5aa7f16dbdab360a39604843913f4249b19e61146b9a18f6e96c2dc68c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_wright, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:46:27 compute-0 systemd[1]: libpod-conmon-787c4e5aa7f16dbdab360a39604843913f4249b19e61146b9a18f6e96c2dc68c.scope: Deactivated successfully.
Dec 01 09:46:27 compute-0 podman[274999]: 2025-12-01 09:46:27.356706236 +0000 UTC m=+0.021274492 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:46:27 compute-0 podman[274999]: 2025-12-01 09:46:27.545226824 +0000 UTC m=+0.209795060 container create 8d0d840de68eb4ae6742fba854e3f418e290d9884a42311788bc8ae45971bde6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_lehmann, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default)
Dec 01 09:46:27 compute-0 systemd[1]: Started libpod-conmon-8d0d840de68eb4ae6742fba854e3f418e290d9884a42311788bc8ae45971bde6.scope.
Dec 01 09:46:27 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:46:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3486d10b20e4fafcaa512d8ef83a34598f644954de59a547cb2692732ed02a33/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:46:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3486d10b20e4fafcaa512d8ef83a34598f644954de59a547cb2692732ed02a33/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:46:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3486d10b20e4fafcaa512d8ef83a34598f644954de59a547cb2692732ed02a33/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:46:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3486d10b20e4fafcaa512d8ef83a34598f644954de59a547cb2692732ed02a33/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:46:27 compute-0 podman[274999]: 2025-12-01 09:46:27.652053704 +0000 UTC m=+0.316621970 container init 8d0d840de68eb4ae6742fba854e3f418e290d9884a42311788bc8ae45971bde6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_lehmann, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:46:27 compute-0 podman[274999]: 2025-12-01 09:46:27.660096865 +0000 UTC m=+0.324665101 container start 8d0d840de68eb4ae6742fba854e3f418e290d9884a42311788bc8ae45971bde6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_lehmann, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.build-date=20250507)
Dec 01 09:46:27 compute-0 podman[274999]: 2025-12-01 09:46:27.664276955 +0000 UTC m=+0.328845191 container attach 8d0d840de68eb4ae6742fba854e3f418e290d9884a42311788bc8ae45971bde6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_lehmann, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:46:27 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1078: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:28 compute-0 sad_lehmann[275017]: {
Dec 01 09:46:28 compute-0 sad_lehmann[275017]:     "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec 01 09:46:28 compute-0 sad_lehmann[275017]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:46:28 compute-0 sad_lehmann[275017]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 01 09:46:28 compute-0 sad_lehmann[275017]:         "osd_id": 0,
Dec 01 09:46:28 compute-0 sad_lehmann[275017]:         "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:46:28 compute-0 sad_lehmann[275017]:         "type": "bluestore"
Dec 01 09:46:28 compute-0 sad_lehmann[275017]:     },
Dec 01 09:46:28 compute-0 sad_lehmann[275017]:     "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec 01 09:46:28 compute-0 sad_lehmann[275017]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:46:28 compute-0 sad_lehmann[275017]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 01 09:46:28 compute-0 sad_lehmann[275017]:         "osd_id": 1,
Dec 01 09:46:28 compute-0 sad_lehmann[275017]:         "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:46:28 compute-0 sad_lehmann[275017]:         "type": "bluestore"
Dec 01 09:46:28 compute-0 sad_lehmann[275017]:     },
Dec 01 09:46:28 compute-0 sad_lehmann[275017]:     "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec 01 09:46:28 compute-0 sad_lehmann[275017]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:46:28 compute-0 sad_lehmann[275017]:         "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec 01 09:46:28 compute-0 sad_lehmann[275017]:         "osd_id": 2,
Dec 01 09:46:28 compute-0 sad_lehmann[275017]:         "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:46:28 compute-0 sad_lehmann[275017]:         "type": "bluestore"
Dec 01 09:46:28 compute-0 sad_lehmann[275017]:     }
Dec 01 09:46:28 compute-0 sad_lehmann[275017]: }
Dec 01 09:46:28 compute-0 systemd[1]: libpod-8d0d840de68eb4ae6742fba854e3f418e290d9884a42311788bc8ae45971bde6.scope: Deactivated successfully.
Dec 01 09:46:28 compute-0 systemd[1]: libpod-8d0d840de68eb4ae6742fba854e3f418e290d9884a42311788bc8ae45971bde6.scope: Consumed 1.144s CPU time.
Dec 01 09:46:28 compute-0 podman[274999]: 2025-12-01 09:46:28.793739243 +0000 UTC m=+1.458307469 container died 8d0d840de68eb4ae6742fba854e3f418e290d9884a42311788bc8ae45971bde6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_lehmann, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:46:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-3486d10b20e4fafcaa512d8ef83a34598f644954de59a547cb2692732ed02a33-merged.mount: Deactivated successfully.
Dec 01 09:46:28 compute-0 podman[274999]: 2025-12-01 09:46:28.851932065 +0000 UTC m=+1.516500311 container remove 8d0d840de68eb4ae6742fba854e3f418e290d9884a42311788bc8ae45971bde6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_lehmann, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:46:28 compute-0 systemd[1]: libpod-conmon-8d0d840de68eb4ae6742fba854e3f418e290d9884a42311788bc8ae45971bde6.scope: Deactivated successfully.
Dec 01 09:46:28 compute-0 sudo[274892]: pam_unix(sudo:session): session closed for user root
Dec 01 09:46:28 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:46:28 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:46:28 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:46:28 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:46:28 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 6bf22ed4-4dfe-46bc-9da4-8204a6737216 does not exist
Dec 01 09:46:28 compute-0 sudo[275062]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:46:28 compute-0 sudo[275062]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:46:28 compute-0 sudo[275062]: pam_unix(sudo:session): session closed for user root
Dec 01 09:46:28 compute-0 ceph-mon[75031]: pgmap v1078: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:28 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:46:28 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:46:29 compute-0 sudo[275087]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:46:29 compute-0 sudo[275087]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:46:29 compute-0 sudo[275087]: pam_unix(sudo:session): session closed for user root
Dec 01 09:46:29 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1079: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:30 compute-0 ceph-mon[75031]: pgmap v1079: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:30 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:46:31 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1080: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:32 compute-0 ceph-mon[75031]: pgmap v1080: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:33 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1081: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:34 compute-0 ceph-mon[75031]: pgmap v1081: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:35 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:46:35 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1082: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:35 compute-0 podman[275112]: 2025-12-01 09:46:35.998135819 +0000 UTC m=+0.096043812 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 01 09:46:36 compute-0 ceph-mon[75031]: pgmap v1082: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:37 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1083: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:38 compute-0 ceph-mon[75031]: pgmap v1083: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:39 compute-0 nova_compute[250706]: 2025-12-01 09:46:39.444 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:46:39 compute-0 nova_compute[250706]: 2025-12-01 09:46:39.444 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:46:39 compute-0 nova_compute[250706]: 2025-12-01 09:46:39.445 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 09:46:39 compute-0 nova_compute[250706]: 2025-12-01 09:46:39.445 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 09:46:39 compute-0 nova_compute[250706]: 2025-12-01 09:46:39.468 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 09:46:39 compute-0 nova_compute[250706]: 2025-12-01 09:46:39.469 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:46:39 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1084: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:40 compute-0 nova_compute[250706]: 2025-12-01 09:46:40.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:46:40 compute-0 nova_compute[250706]: 2025-12-01 09:46:40.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:46:40 compute-0 nova_compute[250706]: 2025-12-01 09:46:40.053 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:46:40 compute-0 nova_compute[250706]: 2025-12-01 09:46:40.053 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 09:46:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:46:40 compute-0 ceph-mon[75031]: pgmap v1084: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:41 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1085: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:42 compute-0 nova_compute[250706]: 2025-12-01 09:46:42.049 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:46:42 compute-0 nova_compute[250706]: 2025-12-01 09:46:42.067 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:46:42 compute-0 nova_compute[250706]: 2025-12-01 09:46:42.067 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:46:42 compute-0 ceph-mon[75031]: pgmap v1085: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:46:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:46:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:46:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:46:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:46:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:46:43 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1086: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:44 compute-0 ceph-mon[75031]: pgmap v1086: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Dec 01 09:46:44 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1271799679' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:46:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Dec 01 09:46:44 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1271799679' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:46:45 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/1271799679' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:46:45 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/1271799679' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:46:45 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:46:45 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1087: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:46 compute-0 nova_compute[250706]: 2025-12-01 09:46:46.053 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:46:46 compute-0 ceph-mon[75031]: pgmap v1087: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:46 compute-0 nova_compute[250706]: 2025-12-01 09:46:46.090 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:46:46 compute-0 nova_compute[250706]: 2025-12-01 09:46:46.090 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:46:46 compute-0 nova_compute[250706]: 2025-12-01 09:46:46.091 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:46:46 compute-0 nova_compute[250706]: 2025-12-01 09:46:46.091 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 09:46:46 compute-0 nova_compute[250706]: 2025-12-01 09:46:46.091 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:46:46 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:46:46 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2485506119' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:46:46 compute-0 nova_compute[250706]: 2025-12-01 09:46:46.523 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:46:46 compute-0 nova_compute[250706]: 2025-12-01 09:46:46.745 250710 WARNING nova.virt.libvirt.driver [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 09:46:46 compute-0 nova_compute[250706]: 2025-12-01 09:46:46.748 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5129MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 09:46:46 compute-0 nova_compute[250706]: 2025-12-01 09:46:46.749 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:46:46 compute-0 nova_compute[250706]: 2025-12-01 09:46:46.749 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:46:46 compute-0 nova_compute[250706]: 2025-12-01 09:46:46.829 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 09:46:46 compute-0 nova_compute[250706]: 2025-12-01 09:46:46.830 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 09:46:46 compute-0 nova_compute[250706]: 2025-12-01 09:46:46.875 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:46:47 compute-0 podman[275156]: 2025-12-01 09:46:47.042078354 +0000 UTC m=+0.127288640 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 01 09:46:47 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2485506119' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:46:47 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:46:47 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3575841280' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:46:47 compute-0 nova_compute[250706]: 2025-12-01 09:46:47.367 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:46:47 compute-0 nova_compute[250706]: 2025-12-01 09:46:47.373 250710 DEBUG nova.compute.provider_tree [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed in ProviderTree for provider: 847e3dbe-0f76-4032-a374-8c965945c22f update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 09:46:47 compute-0 nova_compute[250706]: 2025-12-01 09:46:47.390 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed for provider 847e3dbe-0f76-4032-a374-8c965945c22f based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 09:46:47 compute-0 nova_compute[250706]: 2025-12-01 09:46:47.391 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 09:46:47 compute-0 nova_compute[250706]: 2025-12-01 09:46:47.392 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.642s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:46:47 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1088: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:48 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3575841280' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:46:48 compute-0 ceph-mon[75031]: pgmap v1088: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:49 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1089: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:50 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:46:50 compute-0 podman[275204]: 2025-12-01 09:46:50.979701341 +0000 UTC m=+0.081540564 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 01 09:46:50 compute-0 ceph-mon[75031]: pgmap v1089: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:51 compute-0 rsyslogd[1007]: imjournal: 15766 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Dec 01 09:46:51 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1090: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:52 compute-0 ceph-mon[75031]: pgmap v1090: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:53 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1091: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:55 compute-0 ceph-mon[75031]: pgmap v1091: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:55 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:46:55 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1092: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:56 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #51. Immutable memtables: 0.
Dec 01 09:46:56 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:46:56.063059) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 09:46:56 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 51
Dec 01 09:46:56 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582416063108, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 2053, "num_deletes": 251, "total_data_size": 2360806, "memory_usage": 2416272, "flush_reason": "Manual Compaction"}
Dec 01 09:46:56 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #52: started
Dec 01 09:46:56 compute-0 ceph-mon[75031]: pgmap v1092: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:56 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582416084674, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 52, "file_size": 2278801, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20558, "largest_seqno": 22610, "table_properties": {"data_size": 2269563, "index_size": 5796, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2373, "raw_key_size": 18529, "raw_average_key_size": 19, "raw_value_size": 2251041, "raw_average_value_size": 2423, "num_data_blocks": 266, "num_entries": 929, "num_filter_entries": 929, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582189, "oldest_key_time": 1764582189, "file_creation_time": 1764582416, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 52, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:46:56 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 21699 microseconds, and 11371 cpu microseconds.
Dec 01 09:46:56 compute-0 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 09:46:56 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:46:56.084751) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #52: 2278801 bytes OK
Dec 01 09:46:56 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:46:56.084787) [db/memtable_list.cc:519] [default] Level-0 commit table #52 started
Dec 01 09:46:56 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:46:56.086843) [db/memtable_list.cc:722] [default] Level-0 commit table #52: memtable #1 done
Dec 01 09:46:56 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:46:56.086867) EVENT_LOG_v1 {"time_micros": 1764582416086859, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 01 09:46:56 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:46:56.086891) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 01 09:46:56 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 2352208, prev total WAL file size 2352208, number of live WAL files 2.
Dec 01 09:46:56 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000048.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:46:56 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:46:56.088609) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031373537' seq:72057594037927935, type:22 .. '7061786F730032303039' seq:0, type:0; will stop at (end)
Dec 01 09:46:56 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 01 09:46:56 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [52(2225KB)], [50(5587KB)]
Dec 01 09:46:56 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582416088698, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [52], "files_L6": [50], "score": -1, "input_data_size": 8000126, "oldest_snapshot_seqno": -1}
Dec 01 09:46:56 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #53: 4423 keys, 6764644 bytes, temperature: kUnknown
Dec 01 09:46:56 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582416147635, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 53, "file_size": 6764644, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 6732018, "index_size": 20484, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11077, "raw_key_size": 106061, "raw_average_key_size": 23, "raw_value_size": 6649479, "raw_average_value_size": 1503, "num_data_blocks": 873, "num_entries": 4423, "num_filter_entries": 4423, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580340, "oldest_key_time": 0, "file_creation_time": 1764582416, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:46:56 compute-0 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 09:46:56 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:46:56.147990) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 6764644 bytes
Dec 01 09:46:56 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:46:56.149860) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 135.5 rd, 114.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.2, 5.5 +0.0 blob) out(6.5 +0.0 blob), read-write-amplify(6.5) write-amplify(3.0) OK, records in: 4937, records dropped: 514 output_compression: NoCompression
Dec 01 09:46:56 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:46:56.149890) EVENT_LOG_v1 {"time_micros": 1764582416149876, "job": 26, "event": "compaction_finished", "compaction_time_micros": 59036, "compaction_time_cpu_micros": 34716, "output_level": 6, "num_output_files": 1, "total_output_size": 6764644, "num_input_records": 4937, "num_output_records": 4423, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 09:46:56 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000052.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:46:56 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582416150966, "job": 26, "event": "table_file_deletion", "file_number": 52}
Dec 01 09:46:56 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:46:56 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582416153379, "job": 26, "event": "table_file_deletion", "file_number": 50}
Dec 01 09:46:56 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:46:56.088340) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:46:56 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:46:56.153519) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:46:56 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:46:56.153529) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:46:56 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:46:56.153533) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:46:56 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:46:56.153536) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:46:56 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:46:56.153540) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:46:57 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1093: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:58 compute-0 ceph-mon[75031]: pgmap v1093: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:46:59 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1094: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:00 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:47:01 compute-0 ceph-mon[75031]: pgmap v1094: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:01 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1095: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:02 compute-0 ceph-mon[75031]: pgmap v1095: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:03 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1096: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:04 compute-0 ceph-mon[75031]: pgmap v1096: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:05 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:47:05 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1097: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:06 compute-0 ceph-mon[75031]: pgmap v1097: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:06 compute-0 podman[275223]: 2025-12-01 09:47:06.991248429 +0000 UTC m=+0.084018516 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:47:07 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1098: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:08 compute-0 ceph-mon[75031]: pgmap v1098: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:09 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1099: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:10 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:47:11 compute-0 ceph-mon[75031]: pgmap v1099: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:11 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1100: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:12 compute-0 ceph-mon[75031]: pgmap v1100: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:47:13
Dec 01 09:47:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:47:13 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:47:13 compute-0 ceph-mgr[75324]: [balancer INFO root] pools ['backups', 'images', 'volumes', 'cephfs.cephfs.meta', 'vms', 'cephfs.cephfs.data', '.mgr']
Dec 01 09:47:13 compute-0 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec 01 09:47:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:47:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:47:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:47:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:47:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:47:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:47:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:47:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:47:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:47:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:47:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:47:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:47:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:47:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:47:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:47:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:47:13 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1101: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:14 compute-0 ceph-mon[75031]: pgmap v1101: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:15 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:47:15 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1102: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:17 compute-0 ceph-mon[75031]: pgmap v1102: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:17 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1103: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:18 compute-0 ceph-mon[75031]: pgmap v1103: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:18 compute-0 podman[275243]: 2025-12-01 09:47:18.01250418 +0000 UTC m=+0.112590166 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Dec 01 09:47:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] _maybe_adjust
Dec 01 09:47:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:47:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:47:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:47:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 6.359070782053786e-08 of space, bias 1.0, pg target 1.907721234616136e-05 quantized to 32 (current 32)
Dec 01 09:47:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:47:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 6.359070782053786e-08 of space, bias 1.0, pg target 1.907721234616136e-05 quantized to 32 (current 32)
Dec 01 09:47:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:47:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:47:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:47:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0006661126644201341 of space, bias 1.0, pg target 0.19983379932604023 quantized to 32 (current 32)
Dec 01 09:47:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:47:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec 01 09:47:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:47:18 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:47:19 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1104: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:47:20.485 159899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:47:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:47:20.485 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:47:20 compute-0 ovn_metadata_agent[159893]: 2025-12-01 09:47:20.485 159899 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:47:20 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:47:20 compute-0 ceph-mon[75031]: pgmap v1104: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:21 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1105: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:21 compute-0 podman[275271]: 2025-12-01 09:47:21.957589972 +0000 UTC m=+0.059808620 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, tcib_managed=true)
Dec 01 09:47:23 compute-0 ceph-mon[75031]: pgmap v1105: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:23 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1106: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:24 compute-0 ceph-mon[75031]: pgmap v1106: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:25 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:47:25 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1107: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:27 compute-0 ceph-mon[75031]: pgmap v1107: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:27 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1108: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:29 compute-0 ceph-mon[75031]: pgmap v1108: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:29 compute-0 sudo[275290]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:47:29 compute-0 sudo[275290]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:47:29 compute-0 sudo[275290]: pam_unix(sudo:session): session closed for user root
Dec 01 09:47:29 compute-0 sudo[275315]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:47:29 compute-0 sudo[275315]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:47:29 compute-0 sudo[275315]: pam_unix(sudo:session): session closed for user root
Dec 01 09:47:29 compute-0 sudo[275340]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:47:29 compute-0 sudo[275340]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:47:29 compute-0 sudo[275340]: pam_unix(sudo:session): session closed for user root
Dec 01 09:47:29 compute-0 sudo[275365]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ls
Dec 01 09:47:29 compute-0 sudo[275365]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:47:29 compute-0 podman[275462]: 2025-12-01 09:47:29.911500786 +0000 UTC m=+0.068412787 container exec a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507)
Dec 01 09:47:29 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1109: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:30 compute-0 ceph-mon[75031]: pgmap v1109: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:30 compute-0 podman[275462]: 2025-12-01 09:47:30.028766776 +0000 UTC m=+0.185678767 container exec_died a46df485ce4f8ad590e3b7b36c6d5a2eab89cfc0ea9df7ca781b5e73c00c86d7 (image=quay.io/ceph/ceph:v18, name=ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mon-compute-0, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Dec 01 09:47:30 compute-0 sudo[275365]: pam_unix(sudo:session): session closed for user root
Dec 01 09:47:30 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:47:30 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:47:30 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:47:30 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:47:30 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:47:30 compute-0 sudo[275602]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:47:30 compute-0 sudo[275602]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:47:30 compute-0 sudo[275602]: pam_unix(sudo:session): session closed for user root
Dec 01 09:47:30 compute-0 sudo[275627]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:47:30 compute-0 sudo[275627]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:47:30 compute-0 sudo[275627]: pam_unix(sudo:session): session closed for user root
Dec 01 09:47:30 compute-0 sudo[275652]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:47:30 compute-0 sudo[275652]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:47:30 compute-0 sudo[275652]: pam_unix(sudo:session): session closed for user root
Dec 01 09:47:31 compute-0 sudo[275677]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Dec 01 09:47:31 compute-0 sudo[275677]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:47:31 compute-0 sudo[275677]: pam_unix(sudo:session): session closed for user root
Dec 01 09:47:31 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:47:31 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:47:31 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) v1
Dec 01 09:47:31 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:47:31 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) v1
Dec 01 09:47:31 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:47:31 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 442cf54a-cc02-4e02-a683-f372c98f4f1f does not exist
Dec 01 09:47:31 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 827f365c-a929-46fa-825d-62f09dd2d050 does not exist
Dec 01 09:47:31 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev 227a8cc2-5b7b-468f-9b7f-732a29910564 does not exist
Dec 01 09:47:31 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) v1
Dec 01 09:47:31 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:47:31 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0) v1
Dec 01 09:47:31 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:47:31 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:47:31 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:47:31 compute-0 sudo[275734]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:47:31 compute-0 sudo[275734]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:47:31 compute-0 sudo[275734]: pam_unix(sudo:session): session closed for user root
Dec 01 09:47:31 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:47:31 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:47:31 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:47:31 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:47:31 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:47:31 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:47:31 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:47:31 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:47:31 compute-0 sudo[275759]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:47:31 compute-0 sudo[275759]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:47:31 compute-0 sudo[275759]: pam_unix(sudo:session): session closed for user root
Dec 01 09:47:31 compute-0 sudo[275784]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:47:31 compute-0 sudo[275784]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:47:31 compute-0 sudo[275784]: pam_unix(sudo:session): session closed for user root
Dec 01 09:47:31 compute-0 sudo[275809]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --yes --no-systemd
Dec 01 09:47:31 compute-0 sudo[275809]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:47:31 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1110: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:32 compute-0 podman[275874]: 2025-12-01 09:47:32.271722083 +0000 UTC m=+0.049964327 container create 8d145a496174f845254aacdd434d2904279855f9e8c147b885afb18808729efd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_joliot, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Dec 01 09:47:32 compute-0 systemd[1]: Started libpod-conmon-8d145a496174f845254aacdd434d2904279855f9e8c147b885afb18808729efd.scope.
Dec 01 09:47:32 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:47:32 compute-0 podman[275874]: 2025-12-01 09:47:32.248874026 +0000 UTC m=+0.027116300 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:47:32 compute-0 podman[275874]: 2025-12-01 09:47:32.351434373 +0000 UTC m=+0.129676637 container init 8d145a496174f845254aacdd434d2904279855f9e8c147b885afb18808729efd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_joliot, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:47:32 compute-0 podman[275874]: 2025-12-01 09:47:32.3593061 +0000 UTC m=+0.137548334 container start 8d145a496174f845254aacdd434d2904279855f9e8c147b885afb18808729efd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_joliot, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:47:32 compute-0 podman[275874]: 2025-12-01 09:47:32.363029517 +0000 UTC m=+0.141271781 container attach 8d145a496174f845254aacdd434d2904279855f9e8c147b885afb18808729efd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_joliot, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:47:32 compute-0 optimistic_joliot[275890]: 167 167
Dec 01 09:47:32 compute-0 systemd[1]: libpod-8d145a496174f845254aacdd434d2904279855f9e8c147b885afb18808729efd.scope: Deactivated successfully.
Dec 01 09:47:32 compute-0 conmon[275890]: conmon 8d145a496174f845254a <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-8d145a496174f845254aacdd434d2904279855f9e8c147b885afb18808729efd.scope/container/memory.events
Dec 01 09:47:32 compute-0 podman[275874]: 2025-12-01 09:47:32.366666701 +0000 UTC m=+0.144908945 container died 8d145a496174f845254aacdd434d2904279855f9e8c147b885afb18808729efd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_joliot, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef)
Dec 01 09:47:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-19c18688748a935448676d260a50cbfc3d66bcefbe7df20c926aea50f1fe4a60-merged.mount: Deactivated successfully.
Dec 01 09:47:32 compute-0 podman[275874]: 2025-12-01 09:47:32.404658333 +0000 UTC m=+0.182900577 container remove 8d145a496174f845254aacdd434d2904279855f9e8c147b885afb18808729efd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_joliot, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:47:32 compute-0 systemd[1]: libpod-conmon-8d145a496174f845254aacdd434d2904279855f9e8c147b885afb18808729efd.scope: Deactivated successfully.
Dec 01 09:47:32 compute-0 podman[275915]: 2025-12-01 09:47:32.603670872 +0000 UTC m=+0.052706176 container create 20daa646978b05fdb57a2339eb07c1f3d19f08097c48dff0820e82112cb60285 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_bose, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:47:32 compute-0 systemd[1]: Started libpod-conmon-20daa646978b05fdb57a2339eb07c1f3d19f08097c48dff0820e82112cb60285.scope.
Dec 01 09:47:32 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:47:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0e9e4f7ae612cc66ca0c7dbadd4d969a66bd3f07b93bb7968a61560e070b0124/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:47:32 compute-0 podman[275915]: 2025-12-01 09:47:32.585787198 +0000 UTC m=+0.034822522 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:47:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0e9e4f7ae612cc66ca0c7dbadd4d969a66bd3f07b93bb7968a61560e070b0124/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:47:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0e9e4f7ae612cc66ca0c7dbadd4d969a66bd3f07b93bb7968a61560e070b0124/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:47:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0e9e4f7ae612cc66ca0c7dbadd4d969a66bd3f07b93bb7968a61560e070b0124/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:47:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0e9e4f7ae612cc66ca0c7dbadd4d969a66bd3f07b93bb7968a61560e070b0124/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:47:32 compute-0 podman[275915]: 2025-12-01 09:47:32.694401779 +0000 UTC m=+0.143437103 container init 20daa646978b05fdb57a2339eb07c1f3d19f08097c48dff0820e82112cb60285 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_bose, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:47:32 compute-0 podman[275915]: 2025-12-01 09:47:32.701522204 +0000 UTC m=+0.150557508 container start 20daa646978b05fdb57a2339eb07c1f3d19f08097c48dff0820e82112cb60285 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_bose, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3)
Dec 01 09:47:32 compute-0 podman[275915]: 2025-12-01 09:47:32.804574705 +0000 UTC m=+0.253610009 container attach 20daa646978b05fdb57a2339eb07c1f3d19f08097c48dff0820e82112cb60285 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_bose, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:47:32 compute-0 ceph-mon[75031]: pgmap v1110: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:33 compute-0 nova_compute[250706]: 2025-12-01 09:47:33.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:47:33 compute-0 nova_compute[250706]: 2025-12-01 09:47:33.054 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 01 09:47:33 compute-0 youthful_bose[275931]: --> passed data devices: 0 physical, 3 LVM
Dec 01 09:47:33 compute-0 youthful_bose[275931]: --> relative data size: 1.0
Dec 01 09:47:33 compute-0 youthful_bose[275931]: --> All data devices are unavailable
Dec 01 09:47:33 compute-0 systemd[1]: libpod-20daa646978b05fdb57a2339eb07c1f3d19f08097c48dff0820e82112cb60285.scope: Deactivated successfully.
Dec 01 09:47:33 compute-0 podman[275915]: 2025-12-01 09:47:33.908108489 +0000 UTC m=+1.357143803 container died 20daa646978b05fdb57a2339eb07c1f3d19f08097c48dff0820e82112cb60285 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_bose, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_REF=reef)
Dec 01 09:47:33 compute-0 systemd[1]: libpod-20daa646978b05fdb57a2339eb07c1f3d19f08097c48dff0820e82112cb60285.scope: Consumed 1.157s CPU time.
Dec 01 09:47:33 compute-0 systemd[1]: var-lib-containers-storage-overlay-0e9e4f7ae612cc66ca0c7dbadd4d969a66bd3f07b93bb7968a61560e070b0124-merged.mount: Deactivated successfully.
Dec 01 09:47:33 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1111: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:33 compute-0 podman[275915]: 2025-12-01 09:47:33.985564865 +0000 UTC m=+1.434600169 container remove 20daa646978b05fdb57a2339eb07c1f3d19f08097c48dff0820e82112cb60285 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_bose, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:47:33 compute-0 systemd[1]: libpod-conmon-20daa646978b05fdb57a2339eb07c1f3d19f08097c48dff0820e82112cb60285.scope: Deactivated successfully.
Dec 01 09:47:34 compute-0 sudo[275809]: pam_unix(sudo:session): session closed for user root
Dec 01 09:47:34 compute-0 sudo[275975]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:47:34 compute-0 sudo[275975]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:47:34 compute-0 sudo[275975]: pam_unix(sudo:session): session closed for user root
Dec 01 09:47:34 compute-0 sudo[276000]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:47:34 compute-0 sudo[276000]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:47:34 compute-0 sudo[276000]: pam_unix(sudo:session): session closed for user root
Dec 01 09:47:34 compute-0 sudo[276025]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:47:34 compute-0 sudo[276025]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:47:34 compute-0 sudo[276025]: pam_unix(sudo:session): session closed for user root
Dec 01 09:47:34 compute-0 sudo[276050]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- lvm list --format json
Dec 01 09:47:34 compute-0 sudo[276050]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:47:34 compute-0 podman[276115]: 2025-12-01 09:47:34.793848272 +0000 UTC m=+0.071127645 container create 2c4125b420df816710273a9a730eaae279d80133bb1e5d0290e4aa814699863a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=clever_mcclintock, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True)
Dec 01 09:47:34 compute-0 systemd[1]: Started libpod-conmon-2c4125b420df816710273a9a730eaae279d80133bb1e5d0290e4aa814699863a.scope.
Dec 01 09:47:34 compute-0 podman[276115]: 2025-12-01 09:47:34.762846871 +0000 UTC m=+0.040126334 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:47:34 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:47:34 compute-0 podman[276115]: 2025-12-01 09:47:34.879951586 +0000 UTC m=+0.157230969 container init 2c4125b420df816710273a9a730eaae279d80133bb1e5d0290e4aa814699863a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=clever_mcclintock, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:47:34 compute-0 podman[276115]: 2025-12-01 09:47:34.892874808 +0000 UTC m=+0.170154221 container start 2c4125b420df816710273a9a730eaae279d80133bb1e5d0290e4aa814699863a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=clever_mcclintock, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True)
Dec 01 09:47:34 compute-0 podman[276115]: 2025-12-01 09:47:34.897400238 +0000 UTC m=+0.174679661 container attach 2c4125b420df816710273a9a730eaae279d80133bb1e5d0290e4aa814699863a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=clever_mcclintock, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 01 09:47:34 compute-0 clever_mcclintock[276132]: 167 167
Dec 01 09:47:34 compute-0 systemd[1]: libpod-2c4125b420df816710273a9a730eaae279d80133bb1e5d0290e4aa814699863a.scope: Deactivated successfully.
Dec 01 09:47:34 compute-0 conmon[276132]: conmon 2c4125b420df81671027 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-2c4125b420df816710273a9a730eaae279d80133bb1e5d0290e4aa814699863a.scope/container/memory.events
Dec 01 09:47:34 compute-0 podman[276115]: 2025-12-01 09:47:34.901359882 +0000 UTC m=+0.178639285 container died 2c4125b420df816710273a9a730eaae279d80133bb1e5d0290e4aa814699863a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=clever_mcclintock, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Dec 01 09:47:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-9890bc63344c8753e91ea8205a958d432304f4c03b1a5d17e93264619b570da0-merged.mount: Deactivated successfully.
Dec 01 09:47:34 compute-0 podman[276115]: 2025-12-01 09:47:34.951807581 +0000 UTC m=+0.229086974 container remove 2c4125b420df816710273a9a730eaae279d80133bb1e5d0290e4aa814699863a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=clever_mcclintock, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Dec 01 09:47:34 compute-0 systemd[1]: libpod-conmon-2c4125b420df816710273a9a730eaae279d80133bb1e5d0290e4aa814699863a.scope: Deactivated successfully.
Dec 01 09:47:35 compute-0 ceph-mon[75031]: pgmap v1111: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:35 compute-0 podman[276158]: 2025-12-01 09:47:35.1556617 +0000 UTC m=+0.056107454 container create a2e5e72aa6805a9ed17066400563aed84467651f8750c0a0aaf5c07205278db7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=peaceful_lehmann, org.label-schema.build-date=20250507, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:47:35 compute-0 systemd[1]: Started libpod-conmon-a2e5e72aa6805a9ed17066400563aed84467651f8750c0a0aaf5c07205278db7.scope.
Dec 01 09:47:35 compute-0 podman[276158]: 2025-12-01 09:47:35.131130735 +0000 UTC m=+0.031576569 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:47:35 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:47:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/da46aa3578d6033f89ebc9e27fb2db5a2afa153cef22b88e2f43cc4e5fcfc4db/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:47:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/da46aa3578d6033f89ebc9e27fb2db5a2afa153cef22b88e2f43cc4e5fcfc4db/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:47:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/da46aa3578d6033f89ebc9e27fb2db5a2afa153cef22b88e2f43cc4e5fcfc4db/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:47:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/da46aa3578d6033f89ebc9e27fb2db5a2afa153cef22b88e2f43cc4e5fcfc4db/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:47:35 compute-0 podman[276158]: 2025-12-01 09:47:35.269025268 +0000 UTC m=+0.169471092 container init a2e5e72aa6805a9ed17066400563aed84467651f8750c0a0aaf5c07205278db7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=peaceful_lehmann, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef)
Dec 01 09:47:35 compute-0 podman[276158]: 2025-12-01 09:47:35.286416778 +0000 UTC m=+0.186862532 container start a2e5e72aa6805a9ed17066400563aed84467651f8750c0a0aaf5c07205278db7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=peaceful_lehmann, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 01 09:47:35 compute-0 podman[276158]: 2025-12-01 09:47:35.290524996 +0000 UTC m=+0.190970870 container attach a2e5e72aa6805a9ed17066400563aed84467651f8750c0a0aaf5c07205278db7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=peaceful_lehmann, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:47:35 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:47:35 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1112: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]: {
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:     "0": [
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:         {
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             "devices": [
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "/dev/loop3"
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             ],
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             "lv_name": "ceph_lv0",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             "lv_size": "21470642176",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cfc4d29-4b80-4e2d-94cb-e544135847a5,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             "lv_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             "name": "ceph_lv0",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             "tags": {
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.block_uuid": "gaLirQ-JsUV-D9oP-gzUt-rR0A-M1TZ-f1d0Ny",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.cluster_name": "ceph",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.crush_device_class": "",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.encrypted": "0",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.osd_fsid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.osd_id": "0",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.type": "block",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.vdo": "0"
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             },
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             "type": "block",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             "vg_name": "ceph_vg0"
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:         }
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:     ],
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:     "1": [
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:         {
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             "devices": [
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "/dev/loop4"
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             ],
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             "lv_name": "ceph_lv1",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             "lv_size": "21470642176",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b055e1b3-f94e-4d5e-be04-bafc3cd07aa2,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             "lv_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             "name": "ceph_lv1",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             "tags": {
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.block_uuid": "nHY1gY-XPvt-cSAL-f8uI-Wpec-F5TM-Yy84Dj",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.cluster_name": "ceph",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.crush_device_class": "",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.encrypted": "0",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.osd_fsid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.osd_id": "1",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.type": "block",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.vdo": "0"
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             },
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             "type": "block",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             "vg_name": "ceph_vg1"
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:         }
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:     ],
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:     "2": [
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:         {
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             "devices": [
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "/dev/loop5"
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             ],
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             "lv_name": "ceph_lv2",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             "lv_size": "21470642176",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5620a9fb-e540-5250-a0e8-7aaad5347e3b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c0c71a6c-e9f0-420a-90ae-6660eaf041be,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             "lv_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             "name": "ceph_lv2",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             "tags": {
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.block_uuid": "QsmuN2-hZSH-jScn-zuQF-wTTe-sKYy-vhd1xf",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.cluster_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.cluster_name": "ceph",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.crush_device_class": "",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.encrypted": "0",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.osd_fsid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.osd_id": "2",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.type": "block",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:                 "ceph.vdo": "0"
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             },
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             "type": "block",
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:             "vg_name": "ceph_vg2"
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:         }
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]:     ]
Dec 01 09:47:36 compute-0 peaceful_lehmann[276175]: }
Dec 01 09:47:36 compute-0 ceph-mon[75031]: pgmap v1112: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:36 compute-0 systemd[1]: libpod-a2e5e72aa6805a9ed17066400563aed84467651f8750c0a0aaf5c07205278db7.scope: Deactivated successfully.
Dec 01 09:47:36 compute-0 podman[276158]: 2025-12-01 09:47:36.091888975 +0000 UTC m=+0.992334729 container died a2e5e72aa6805a9ed17066400563aed84467651f8750c0a0aaf5c07205278db7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=peaceful_lehmann, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507)
Dec 01 09:47:36 compute-0 systemd[1]: var-lib-containers-storage-overlay-da46aa3578d6033f89ebc9e27fb2db5a2afa153cef22b88e2f43cc4e5fcfc4db-merged.mount: Deactivated successfully.
Dec 01 09:47:36 compute-0 podman[276158]: 2025-12-01 09:47:36.147247766 +0000 UTC m=+1.047693530 container remove a2e5e72aa6805a9ed17066400563aed84467651f8750c0a0aaf5c07205278db7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=peaceful_lehmann, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:47:36 compute-0 systemd[1]: libpod-conmon-a2e5e72aa6805a9ed17066400563aed84467651f8750c0a0aaf5c07205278db7.scope: Deactivated successfully.
Dec 01 09:47:36 compute-0 sudo[276050]: pam_unix(sudo:session): session closed for user root
Dec 01 09:47:36 compute-0 sudo[276196]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:47:36 compute-0 sudo[276196]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:47:36 compute-0 sudo[276196]: pam_unix(sudo:session): session closed for user root
Dec 01 09:47:36 compute-0 sudo[276221]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:47:36 compute-0 sudo[276221]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:47:36 compute-0 sudo[276221]: pam_unix(sudo:session): session closed for user root
Dec 01 09:47:36 compute-0 sudo[276246]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:47:36 compute-0 sudo[276246]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:47:36 compute-0 sudo[276246]: pam_unix(sudo:session): session closed for user root
Dec 01 09:47:36 compute-0 sudo[276271]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5620a9fb-e540-5250-a0e8-7aaad5347e3b/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 5620a9fb-e540-5250-a0e8-7aaad5347e3b -- raw list --format json
Dec 01 09:47:36 compute-0 sudo[276271]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:47:36 compute-0 podman[276336]: 2025-12-01 09:47:36.913752883 +0000 UTC m=+0.060447868 container create 8dc2e8ce9d17ef6f19fd18c6dd93b32eb10646fc86339d545bf194add700b49d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_driscoll, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_REF=reef, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:47:36 compute-0 systemd[1]: Started libpod-conmon-8dc2e8ce9d17ef6f19fd18c6dd93b32eb10646fc86339d545bf194add700b49d.scope.
Dec 01 09:47:36 compute-0 podman[276336]: 2025-12-01 09:47:36.886411528 +0000 UTC m=+0.033106603 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:47:36 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:47:37 compute-0 podman[276336]: 2025-12-01 09:47:37.01212088 +0000 UTC m=+0.158815915 container init 8dc2e8ce9d17ef6f19fd18c6dd93b32eb10646fc86339d545bf194add700b49d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_driscoll, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Dec 01 09:47:37 compute-0 podman[276336]: 2025-12-01 09:47:37.022267572 +0000 UTC m=+0.168962557 container start 8dc2e8ce9d17ef6f19fd18c6dd93b32eb10646fc86339d545bf194add700b49d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_driscoll, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507)
Dec 01 09:47:37 compute-0 podman[276336]: 2025-12-01 09:47:37.027364918 +0000 UTC m=+0.174059993 container attach 8dc2e8ce9d17ef6f19fd18c6dd93b32eb10646fc86339d545bf194add700b49d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_driscoll, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=reef)
Dec 01 09:47:37 compute-0 trusting_driscoll[276353]: 167 167
Dec 01 09:47:37 compute-0 systemd[1]: libpod-8dc2e8ce9d17ef6f19fd18c6dd93b32eb10646fc86339d545bf194add700b49d.scope: Deactivated successfully.
Dec 01 09:47:37 compute-0 podman[276336]: 2025-12-01 09:47:37.030635852 +0000 UTC m=+0.177330847 container died 8dc2e8ce9d17ef6f19fd18c6dd93b32eb10646fc86339d545bf194add700b49d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_driscoll, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:47:37 compute-0 systemd[1]: var-lib-containers-storage-overlay-3dc38c1a22ac1b08a7dbb0a7d2d01b2dcf4d8c7211bc7e0635eb962152e43ace-merged.mount: Deactivated successfully.
Dec 01 09:47:37 compute-0 podman[276336]: 2025-12-01 09:47:37.089209826 +0000 UTC m=+0.235904821 container remove 8dc2e8ce9d17ef6f19fd18c6dd93b32eb10646fc86339d545bf194add700b49d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_driscoll, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:47:37 compute-0 systemd[1]: libpod-conmon-8dc2e8ce9d17ef6f19fd18c6dd93b32eb10646fc86339d545bf194add700b49d.scope: Deactivated successfully.
Dec 01 09:47:37 compute-0 podman[276358]: 2025-12-01 09:47:37.149281432 +0000 UTC m=+0.077396485 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 01 09:47:37 compute-0 podman[276395]: 2025-12-01 09:47:37.309215608 +0000 UTC m=+0.058804011 container create 3e1cf89c5b983b7c82f3eb80945e357a90a80fdd42dbbdfc4341f1cf45b21716 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_jackson, ceph=True, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3)
Dec 01 09:47:37 compute-0 systemd[1]: Started libpod-conmon-3e1cf89c5b983b7c82f3eb80945e357a90a80fdd42dbbdfc4341f1cf45b21716.scope.
Dec 01 09:47:37 compute-0 podman[276395]: 2025-12-01 09:47:37.282743507 +0000 UTC m=+0.032331950 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Dec 01 09:47:37 compute-0 systemd[1]: Started libcrun container.
Dec 01 09:47:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd75a5b791990d0f684e7959f5318135e59f09956fc88fb02c308bf1fdb26fb3/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:47:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd75a5b791990d0f684e7959f5318135e59f09956fc88fb02c308bf1fdb26fb3/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:47:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd75a5b791990d0f684e7959f5318135e59f09956fc88fb02c308bf1fdb26fb3/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:47:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd75a5b791990d0f684e7959f5318135e59f09956fc88fb02c308bf1fdb26fb3/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:47:37 compute-0 podman[276395]: 2025-12-01 09:47:37.420428824 +0000 UTC m=+0.170017317 container init 3e1cf89c5b983b7c82f3eb80945e357a90a80fdd42dbbdfc4341f1cf45b21716 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_jackson, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507)
Dec 01 09:47:37 compute-0 podman[276395]: 2025-12-01 09:47:37.432746368 +0000 UTC m=+0.182334801 container start 3e1cf89c5b983b7c82f3eb80945e357a90a80fdd42dbbdfc4341f1cf45b21716 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_jackson, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:47:37 compute-0 podman[276395]: 2025-12-01 09:47:37.437163635 +0000 UTC m=+0.186752128 container attach 3e1cf89c5b983b7c82f3eb80945e357a90a80fdd42dbbdfc4341f1cf45b21716 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_jackson, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Dec 01 09:47:37 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1113: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:38 compute-0 strange_jackson[276412]: {
Dec 01 09:47:38 compute-0 strange_jackson[276412]:     "9cfc4d29-4b80-4e2d-94cb-e544135847a5": {
Dec 01 09:47:38 compute-0 strange_jackson[276412]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:47:38 compute-0 strange_jackson[276412]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 01 09:47:38 compute-0 strange_jackson[276412]:         "osd_id": 0,
Dec 01 09:47:38 compute-0 strange_jackson[276412]:         "osd_uuid": "9cfc4d29-4b80-4e2d-94cb-e544135847a5",
Dec 01 09:47:38 compute-0 strange_jackson[276412]:         "type": "bluestore"
Dec 01 09:47:38 compute-0 strange_jackson[276412]:     },
Dec 01 09:47:38 compute-0 strange_jackson[276412]:     "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2": {
Dec 01 09:47:38 compute-0 strange_jackson[276412]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:47:38 compute-0 strange_jackson[276412]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 01 09:47:38 compute-0 strange_jackson[276412]:         "osd_id": 1,
Dec 01 09:47:38 compute-0 strange_jackson[276412]:         "osd_uuid": "b055e1b3-f94e-4d5e-be04-bafc3cd07aa2",
Dec 01 09:47:38 compute-0 strange_jackson[276412]:         "type": "bluestore"
Dec 01 09:47:38 compute-0 strange_jackson[276412]:     },
Dec 01 09:47:38 compute-0 strange_jackson[276412]:     "c0c71a6c-e9f0-420a-90ae-6660eaf041be": {
Dec 01 09:47:38 compute-0 strange_jackson[276412]:         "ceph_fsid": "5620a9fb-e540-5250-a0e8-7aaad5347e3b",
Dec 01 09:47:38 compute-0 strange_jackson[276412]:         "device": "/dev/mapper/ceph_vg2-ceph_lv2",
Dec 01 09:47:38 compute-0 strange_jackson[276412]:         "osd_id": 2,
Dec 01 09:47:38 compute-0 strange_jackson[276412]:         "osd_uuid": "c0c71a6c-e9f0-420a-90ae-6660eaf041be",
Dec 01 09:47:38 compute-0 strange_jackson[276412]:         "type": "bluestore"
Dec 01 09:47:38 compute-0 strange_jackson[276412]:     }
Dec 01 09:47:38 compute-0 strange_jackson[276412]: }
Dec 01 09:47:38 compute-0 systemd[1]: libpod-3e1cf89c5b983b7c82f3eb80945e357a90a80fdd42dbbdfc4341f1cf45b21716.scope: Deactivated successfully.
Dec 01 09:47:38 compute-0 systemd[1]: libpod-3e1cf89c5b983b7c82f3eb80945e357a90a80fdd42dbbdfc4341f1cf45b21716.scope: Consumed 1.055s CPU time.
Dec 01 09:47:38 compute-0 podman[276395]: 2025-12-01 09:47:38.480023593 +0000 UTC m=+1.229612006 container died 3e1cf89c5b983b7c82f3eb80945e357a90a80fdd42dbbdfc4341f1cf45b21716 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_jackson, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250507, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS)
Dec 01 09:47:38 compute-0 systemd[1]: var-lib-containers-storage-overlay-fd75a5b791990d0f684e7959f5318135e59f09956fc88fb02c308bf1fdb26fb3-merged.mount: Deactivated successfully.
Dec 01 09:47:38 compute-0 podman[276395]: 2025-12-01 09:47:38.538127203 +0000 UTC m=+1.287715616 container remove 3e1cf89c5b983b7c82f3eb80945e357a90a80fdd42dbbdfc4341f1cf45b21716 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_jackson, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True)
Dec 01 09:47:38 compute-0 systemd[1]: libpod-conmon-3e1cf89c5b983b7c82f3eb80945e357a90a80fdd42dbbdfc4341f1cf45b21716.scope: Deactivated successfully.
Dec 01 09:47:38 compute-0 sudo[276271]: pam_unix(sudo:session): session closed for user root
Dec 01 09:47:38 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0) v1
Dec 01 09:47:38 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:47:38 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0) v1
Dec 01 09:47:38 compute-0 ceph-mon[75031]: log_channel(audit) log [INF] : from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:47:38 compute-0 ceph-mgr[75324]: [progress WARNING root] complete: ev cf5f88bc-022c-4c4c-a2c3-177e04ebd4d0 does not exist
Dec 01 09:47:38 compute-0 sudo[276456]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:47:38 compute-0 sudo[276456]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:47:38 compute-0 sudo[276456]: pam_unix(sudo:session): session closed for user root
Dec 01 09:47:38 compute-0 sudo[276481]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:47:38 compute-0 sudo[276481]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:47:38 compute-0 sudo[276481]: pam_unix(sudo:session): session closed for user root
Dec 01 09:47:39 compute-0 ceph-mon[75031]: pgmap v1113: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:39 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:47:39 compute-0 ceph-mon[75031]: from='mgr.14132 192.168.122.100:0/1612915802' entity='mgr.compute-0.psduho' 
Dec 01 09:47:39 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1114: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:40 compute-0 ceph-mon[75031]: pgmap v1114: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:40 compute-0 nova_compute[250706]: 2025-12-01 09:47:40.067 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:47:40 compute-0 nova_compute[250706]: 2025-12-01 09:47:40.068 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:47:40 compute-0 nova_compute[250706]: 2025-12-01 09:47:40.068 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:47:40 compute-0 nova_compute[250706]: 2025-12-01 09:47:40.068 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:47:40 compute-0 nova_compute[250706]: 2025-12-01 09:47:40.069 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:47:40 compute-0 nova_compute[250706]: 2025-12-01 09:47:40.069 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 09:47:40 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:47:41 compute-0 nova_compute[250706]: 2025-12-01 09:47:41.053 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:47:41 compute-0 nova_compute[250706]: 2025-12-01 09:47:41.054 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 09:47:41 compute-0 nova_compute[250706]: 2025-12-01 09:47:41.054 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 09:47:41 compute-0 nova_compute[250706]: 2025-12-01 09:47:41.071 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 09:47:41 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1115: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:43 compute-0 ceph-mon[75031]: pgmap v1115: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:43 compute-0 nova_compute[250706]: 2025-12-01 09:47:43.052 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:47:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:47:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:47:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:47:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:47:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:47:43 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:47:43 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1116: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:44 compute-0 nova_compute[250706]: 2025-12-01 09:47:44.053 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:47:44 compute-0 nova_compute[250706]: 2025-12-01 09:47:44.054 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:47:44 compute-0 ceph-mon[75031]: pgmap v1116: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Dec 01 09:47:44 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1514233815' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:47:44 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Dec 01 09:47:44 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1514233815' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:47:45 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/1514233815' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 09:47:45 compute-0 ceph-mon[75031]: from='client.? 192.168.122.10:0/1514233815' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 09:47:45 compute-0 nova_compute[250706]: 2025-12-01 09:47:45.113 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:47:45 compute-0 nova_compute[250706]: 2025-12-01 09:47:45.114 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 01 09:47:45 compute-0 nova_compute[250706]: 2025-12-01 09:47:45.136 250710 DEBUG nova.compute.manager [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 01 09:47:45 compute-0 sshd-session[276506]: Accepted publickey for zuul from 192.168.122.10 port 40086 ssh2: ECDSA SHA256:qeYLzcMt7u+aIXWvxTim9ZQrhR80x0VMZgLc05vjQmw
Dec 01 09:47:45 compute-0 systemd-logind[788]: New session 55 of user zuul.
Dec 01 09:47:45 compute-0 systemd[1]: Started Session 55 of User zuul.
Dec 01 09:47:45 compute-0 sshd-session[276506]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:47:45 compute-0 sudo[276510]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Dec 01 09:47:45 compute-0 sudo[276510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:47:45 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:47:45 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1117: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:46 compute-0 ceph-mon[75031]: pgmap v1117: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:47 compute-0 nova_compute[250706]: 2025-12-01 09:47:47.075 250710 DEBUG oslo_service.periodic_task [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 09:47:47 compute-0 nova_compute[250706]: 2025-12-01 09:47:47.116 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:47:47 compute-0 nova_compute[250706]: 2025-12-01 09:47:47.116 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:47:47 compute-0 nova_compute[250706]: 2025-12-01 09:47:47.116 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:47:47 compute-0 nova_compute[250706]: 2025-12-01 09:47:47.117 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 09:47:47 compute-0 nova_compute[250706]: 2025-12-01 09:47:47.117 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:47:47 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:47:47 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1622289794' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:47:47 compute-0 nova_compute[250706]: 2025-12-01 09:47:47.627 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:47:47 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1622289794' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:47:47 compute-0 nova_compute[250706]: 2025-12-01 09:47:47.812 250710 WARNING nova.virt.libvirt.driver [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 09:47:47 compute-0 nova_compute[250706]: 2025-12-01 09:47:47.813 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5128MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 09:47:47 compute-0 nova_compute[250706]: 2025-12-01 09:47:47.814 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 09:47:47 compute-0 nova_compute[250706]: 2025-12-01 09:47:47.814 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 09:47:47 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1118: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:48 compute-0 nova_compute[250706]: 2025-12-01 09:47:48.126 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 09:47:48 compute-0 nova_compute[250706]: 2025-12-01 09:47:48.127 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 09:47:48 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15014 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:47:48 compute-0 nova_compute[250706]: 2025-12-01 09:47:48.237 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Refreshing inventories for resource provider 847e3dbe-0f76-4032-a374-8c965945c22f _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 01 09:47:48 compute-0 nova_compute[250706]: 2025-12-01 09:47:48.364 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Updating ProviderTree inventory for provider 847e3dbe-0f76-4032-a374-8c965945c22f from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 01 09:47:48 compute-0 nova_compute[250706]: 2025-12-01 09:47:48.365 250710 DEBUG nova.compute.provider_tree [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Updating inventory in ProviderTree for provider 847e3dbe-0f76-4032-a374-8c965945c22f with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 01 09:47:48 compute-0 nova_compute[250706]: 2025-12-01 09:47:48.418 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Refreshing aggregate associations for resource provider 847e3dbe-0f76-4032-a374-8c965945c22f, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 01 09:47:48 compute-0 nova_compute[250706]: 2025-12-01 09:47:48.457 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Refreshing trait associations for resource provider 847e3dbe-0f76-4032-a374-8c965945c22f, traits: COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AVX2,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE4A,HW_CPU_X86_AESNI,HW_CPU_X86_SVM,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_F16C,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_FMA3,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_MMX,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_BMI2,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSSE3,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 01 09:47:48 compute-0 nova_compute[250706]: 2025-12-01 09:47:48.495 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 09:47:48 compute-0 ceph-mon[75031]: pgmap v1118: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:48 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15016 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:47:48 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Dec 01 09:47:48 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1997129487' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:47:48 compute-0 nova_compute[250706]: 2025-12-01 09:47:48.978 250710 DEBUG oslo_concurrency.processutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 09:47:48 compute-0 nova_compute[250706]: 2025-12-01 09:47:48.984 250710 DEBUG nova.compute.provider_tree [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed in ProviderTree for provider: 847e3dbe-0f76-4032-a374-8c965945c22f update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 09:47:49 compute-0 nova_compute[250706]: 2025-12-01 09:47:49.011 250710 DEBUG nova.scheduler.client.report [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Inventory has not changed for provider 847e3dbe-0f76-4032-a374-8c965945c22f based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 09:47:49 compute-0 nova_compute[250706]: 2025-12-01 09:47:49.012 250710 DEBUG nova.compute.resource_tracker [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 09:47:49 compute-0 nova_compute[250706]: 2025-12-01 09:47:49.012 250710 DEBUG oslo_concurrency.lockutils [None req-11ed26ab-fdbe-4d24-b7e1-7cdbe03069eb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 09:47:49 compute-0 podman[276758]: 2025-12-01 09:47:49.049161653 +0000 UTC m=+0.139774698 container health_status 34cd858183308124099e4d45b7ab29ba8857dadd09b02bcef3546777e1d5961c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 01 09:47:49 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status"} v 0) v1
Dec 01 09:47:49 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/150026497' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec 01 09:47:49 compute-0 ceph-mon[75031]: from='client.15014 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:47:49 compute-0 ceph-mon[75031]: from='client.15016 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:47:49 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1997129487' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 09:47:49 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/150026497' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec 01 09:47:49 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1119: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:50 compute-0 ceph-mon[75031]: pgmap v1119: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:50 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:47:51 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1120: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:52 compute-0 ovs-vsctl[276842]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec 01 09:47:53 compute-0 podman[276889]: 2025-12-01 09:47:53.015034561 +0000 UTC m=+0.106229073 container health_status 195c4e3b331516248db94ab4fd2bd7de7fed07ea929bf040e5f817f87d021dba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 01 09:47:53 compute-0 ceph-mon[75031]: pgmap v1120: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:53 compute-0 virtqemud[250400]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec 01 09:47:53 compute-0 virtqemud[250400]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec 01 09:47:53 compute-0 virtqemud[250400]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 01 09:47:53 compute-0 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: cache status {prefix=cache status} (starting...)
Dec 01 09:47:53 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1121: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:54 compute-0 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: client ls {prefix=client ls} (starting...)
Dec 01 09:47:54 compute-0 ceph-mon[75031]: pgmap v1121: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:54 compute-0 lvm[277184]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 01 09:47:54 compute-0 lvm[277184]: VG ceph_vg0 finished
Dec 01 09:47:54 compute-0 lvm[277192]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 01 09:47:54 compute-0 lvm[277192]: VG ceph_vg2 finished
Dec 01 09:47:54 compute-0 lvm[277196]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 01 09:47:54 compute-0 lvm[277196]: VG ceph_vg1 finished
Dec 01 09:47:54 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15022 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:47:54 compute-0 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: damage ls {prefix=damage ls} (starting...)
Dec 01 09:47:54 compute-0 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: dump loads {prefix=dump loads} (starting...)
Dec 01 09:47:54 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15024 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:47:55 compute-0 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Dec 01 09:47:55 compute-0 ceph-mon[75031]: from='client.15022 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:47:55 compute-0 ceph-mon[75031]: from='client.15024 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:47:55 compute-0 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Dec 01 09:47:55 compute-0 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Dec 01 09:47:55 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "report"} v 0) v1
Dec 01 09:47:55 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1567427368' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Dec 01 09:47:55 compute-0 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Dec 01 09:47:55 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15030 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:47:55 compute-0 ceph-mgr[75324]: mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Dec 01 09:47:55 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:47:55.663+0000 7fd2d6503640 -1 mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Dec 01 09:47:55 compute-0 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Dec 01 09:47:55 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Dec 01 09:47:55 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2474296016' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:47:55 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:47:55 compute-0 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: get subtrees {prefix=get subtrees} (starting...)
Dec 01 09:47:55 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1122: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:56 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1567427368' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Dec 01 09:47:56 compute-0 ceph-mon[75031]: from='client.15030 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:47:56 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2474296016' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:47:56 compute-0 ceph-mon[75031]: pgmap v1122: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:56 compute-0 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: ops {prefix=ops} (starting...)
Dec 01 09:47:56 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config log"} v 0) v1
Dec 01 09:47:56 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3686709795' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Dec 01 09:47:56 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0) v1
Dec 01 09:47:56 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2621865607' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Dec 01 09:47:56 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config-key dump"} v 0) v1
Dec 01 09:47:56 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4178876128' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Dec 01 09:47:56 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Dec 01 09:47:56 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3725325385' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 01 09:47:56 compute-0 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: session ls {prefix=session ls} (starting...)
Dec 01 09:47:56 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15042 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:47:56 compute-0 ceph-mds[98608]: mds.cephfs.compute-0.hrlhzj asok_command: status {prefix=status} (starting...)
Dec 01 09:47:56 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Dec 01 09:47:56 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1138620584' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 01 09:47:57 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3686709795' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Dec 01 09:47:57 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2621865607' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Dec 01 09:47:57 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/4178876128' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Dec 01 09:47:57 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3725325385' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 01 09:47:57 compute-0 ceph-mon[75031]: from='client.15042 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:47:57 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1138620584' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 01 09:47:57 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15046 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:47:57 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Dec 01 09:47:57 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4113522476' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 01 09:47:57 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "features"} v 0) v1
Dec 01 09:47:57 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/872774579' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec 01 09:47:57 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Dec 01 09:47:57 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/233936803' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 01 09:47:57 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1123: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:58 compute-0 ceph-mon[75031]: from='client.15046 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:47:58 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/4113522476' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 01 09:47:58 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/872774579' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec 01 09:47:58 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/233936803' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 01 09:47:58 compute-0 ceph-mon[75031]: pgmap v1123: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:47:58 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0) v1
Dec 01 09:47:58 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/490956330' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Dec 01 09:47:58 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr stat"} v 0) v1
Dec 01 09:47:58 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3674695074' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Dec 01 09:47:58 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15058 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:47:58 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:47:58.532+0000 7fd2d6503640 -1 mgr.server reply reply (95) Operation not supported Module 'insights' is not enabled/loaded (required by command 'insights'): use `ceph mgr module enable insights` to enable it
Dec 01 09:47:58 compute-0 ceph-mgr[75324]: mgr.server reply reply (95) Operation not supported Module 'insights' is not enabled/loaded (required by command 'insights'): use `ceph mgr module enable insights` to enable it
Dec 01 09:47:58 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Dec 01 09:47:58 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2448350616' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 01 09:47:58 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0) v1
Dec 01 09:47:58 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/566368364' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Dec 01 09:47:59 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15064 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:47:59 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/490956330' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Dec 01 09:47:59 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3674695074' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Dec 01 09:47:59 compute-0 ceph-mon[75031]: from='client.15058 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:47:59 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2448350616' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 01 09:47:59 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/566368364' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Dec 01 09:47:59 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0) v1
Dec 01 09:47:59 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2541752812' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Dec 01 09:47:59 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15068 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:47:59 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Dec 01 09:47:59 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1199316515' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 01 09:47:59 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15072 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:47:59 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1124: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:48:00 compute-0 ceph-mon[75031]: from='client.15064 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:48:00 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2541752812' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Dec 01 09:48:00 compute-0 ceph-mon[75031]: from='client.15068 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:48:00 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1199316515' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 01 09:48:00 compute-0 ceph-mon[75031]: from='client.15072 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:48:00 compute-0 ceph-mon[75031]: pgmap v1124: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:48:00 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15075 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:48:00 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Dec 01 09:48:00 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3252037160' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:16.564973+0000 osd.2 (osd.2) 41 : cluster [DBG] 5.1c scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 41) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:16.550889+0000 osd.2 (osd.2) 40 : cluster [DBG] 5.1c scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:16.564973+0000 osd.2 (osd.2) 41 : cluster [DBG] 5.1c scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.1f scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 5.1f scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57327616 unmapped: 1335296 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:48.219822+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 43 sent 41 num 2 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:17.588748+0000 osd.2 (osd.2) 42 : cluster [DBG] 5.1f scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:17.602848+0000 osd.2 (osd.2) 43 : cluster [DBG] 5.1f scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 43) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:17.588748+0000 osd.2 (osd.2) 42 : cluster [DBG] 5.1f scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:17.602848+0000 osd.2 (osd.2) 43 : cluster [DBG] 5.1f scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57327616 unmapped: 1335296 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:49.220014+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 361309 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57327616 unmapped: 1335296 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:50.220174+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.18 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.18 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57344000 unmapped: 1318912 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:51.220388+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 45 sent 43 num 2 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:20.603951+0000 osd.2 (osd.2) 44 : cluster [DBG] 4.18 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:20.617993+0000 osd.2 (osd.2) 45 : cluster [DBG] 4.18 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 45) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:20.603951+0000 osd.2 (osd.2) 44 : cluster [DBG] 4.18 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:20.617993+0000 osd.2 (osd.2) 45 : cluster [DBG] 4.18 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57344000 unmapped: 1318912 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:52.220657+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1b scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1b scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57352192 unmapped: 1310720 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:53.220860+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 47 sent 45 num 2 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:22.648231+0000 osd.2 (osd.2) 46 : cluster [DBG] 4.1b scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:22.662347+0000 osd.2 (osd.2) 47 : cluster [DBG] 4.1b scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 47) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:22.648231+0000 osd.2 (osd.2) 46 : cluster [DBG] 4.1b scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:22.662347+0000 osd.2 (osd.2) 47 : cluster [DBG] 4.1b scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57352192 unmapped: 1310720 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:54.221102+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 363605 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1a scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1a scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57352192 unmapped: 1310720 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:55.221321+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 49 sent 47 num 2 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:24.669734+0000 osd.2 (osd.2) 48 : cluster [DBG] 4.1a scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:24.683799+0000 osd.2 (osd.2) 49 : cluster [DBG] 4.1a scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 49) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:24.669734+0000 osd.2 (osd.2) 48 : cluster [DBG] 4.1a scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:24.683799+0000 osd.2 (osd.2) 49 : cluster [DBG] 4.1a scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57368576 unmapped: 1294336 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:56.221654+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.f scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.919653893s of 10.155382156s, submitted: 10
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.f scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57368576 unmapped: 1294336 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:57.221876+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 51 sent 49 num 2 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:26.706269+0000 osd.2 (osd.2) 50 : cluster [DBG] 6.f scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:26.723859+0000 osd.2 (osd.2) 51 : cluster [DBG] 6.f scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57376768 unmapped: 1286144 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:58.222129+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 51) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:26.706269+0000 osd.2 (osd.2) 50 : cluster [DBG] 6.f scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:26.723859+0000 osd.2 (osd.2) 51 : cluster [DBG] 6.f scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57376768 unmapped: 1286144 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:59.222273+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 365900 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57376768 unmapped: 1286144 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:00.222431+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57384960 unmapped: 1277952 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:01.222587+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57384960 unmapped: 1277952 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:02.222790+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57393152 unmapped: 1269760 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:03.222963+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57393152 unmapped: 1269760 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.e scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.e scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:04.223273+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 53 sent 51 num 2 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:33.790932+0000 osd.2 (osd.2) 52 : cluster [DBG] 4.e scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:33.805014+0000 osd.2 (osd.2) 53 : cluster [DBG] 4.e scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 53) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:33.790932+0000 osd.2 (osd.2) 52 : cluster [DBG] 4.e scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:33.805014+0000 osd.2 (osd.2) 53 : cluster [DBG] 4.e scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 367047 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57401344 unmapped: 1261568 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:05.223668+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57401344 unmapped: 1261568 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:06.223924+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57401344 unmapped: 1261568 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:07.224093+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57409536 unmapped: 1253376 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:08.224528+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57409536 unmapped: 1253376 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:09.224696+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 367047 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57417728 unmapped: 1245184 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:10.224873+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57417728 unmapped: 1245184 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:11.225036+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57425920 unmapped: 1236992 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:12.225212+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57434112 unmapped: 1228800 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:13.225349+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57434112 unmapped: 1228800 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:14.225489+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 367047 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57442304 unmapped: 1220608 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:15.225646+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57442304 unmapped: 1220608 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:16.225790+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57442304 unmapped: 1220608 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:17.225967+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.a scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 20.934640884s of 20.948490143s, submitted: 4
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.a scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57450496 unmapped: 1212416 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:18.226165+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 55 sent 53 num 2 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:47.654642+0000 osd.2 (osd.2) 54 : cluster [DBG] 4.a scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:47.668603+0000 osd.2 (osd.2) 55 : cluster [DBG] 4.a scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57458688 unmapped: 1204224 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 55) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:47.654642+0000 osd.2 (osd.2) 54 : cluster [DBG] 4.a scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:47.668603+0000 osd.2 (osd.2) 55 : cluster [DBG] 4.a scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:19.226796+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 368194 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57458688 unmapped: 1204224 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:20.226924+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57466880 unmapped: 1196032 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:21.227082+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57466880 unmapped: 1196032 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:22.227220+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.8 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.8 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57483264 unmapped: 1179648 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:23.227352+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 57 sent 55 num 2 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:52.628412+0000 osd.2 (osd.2) 56 : cluster [DBG] 6.8 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:52.642469+0000 osd.2 (osd.2) 57 : cluster [DBG] 6.8 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.14 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.14 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57491456 unmapped: 1171456 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:24.227556+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 4 last_log 59 sent 57 num 4 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:53.669273+0000 osd.2 (osd.2) 58 : cluster [DBG] 6.14 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:53.697496+0000 osd.2 (osd.2) 59 : cluster [DBG] 6.14 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 57) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:52.628412+0000 osd.2 (osd.2) 56 : cluster [DBG] 6.8 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:52.642469+0000 osd.2 (osd.2) 57 : cluster [DBG] 6.8 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 59) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:53.669273+0000 osd.2 (osd.2) 58 : cluster [DBG] 6.14 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:53.697496+0000 osd.2 (osd.2) 59 : cluster [DBG] 6.14 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.13 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.13 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 371637 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57499648 unmapped: 1163264 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:25.227748+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 61 sent 59 num 2 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:54.623675+0000 osd.2 (osd.2) 60 : cluster [DBG] 4.13 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:54.637735+0000 osd.2 (osd.2) 61 : cluster [DBG] 4.13 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 61) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:54.623675+0000 osd.2 (osd.2) 60 : cluster [DBG] 4.13 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:54.637735+0000 osd.2 (osd.2) 61 : cluster [DBG] 4.13 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.11 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.11 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57507840 unmapped: 1155072 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:26.228031+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 63 sent 61 num 2 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:55.581391+0000 osd.2 (osd.2) 62 : cluster [DBG] 6.11 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:55.595457+0000 osd.2 (osd.2) 63 : cluster [DBG] 6.11 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 63) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:55.581391+0000 osd.2 (osd.2) 62 : cluster [DBG] 6.11 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:55.595457+0000 osd.2 (osd.2) 63 : cluster [DBG] 6.11 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57516032 unmapped: 1146880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:27.228333+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57516032 unmapped: 1146880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:28.228523+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57516032 unmapped: 1146880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:29.228704+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.11 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.327646255s of 11.902298927s, submitted: 10
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.11 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 373933 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57532416 unmapped: 1130496 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:30.228871+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 65 sent 63 num 2 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:59.557143+0000 osd.2 (osd.2) 64 : cluster [DBG] 4.11 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:16:59.571187+0000 osd.2 (osd.2) 65 : cluster [DBG] 4.11 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 65) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:59.557143+0000 osd.2 (osd.2) 64 : cluster [DBG] 4.11 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:16:59.571187+0000 osd.2 (osd.2) 65 : cluster [DBG] 4.11 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57548800 unmapped: 1114112 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:31.229251+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.13 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.13 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57548800 unmapped: 1114112 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:32.229527+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 67 sent 65 num 2 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:01.574587+0000 osd.2 (osd.2) 66 : cluster [DBG] 6.13 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:01.595683+0000 osd.2 (osd.2) 67 : cluster [DBG] 6.13 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.15 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.15 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 67) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:01.574587+0000 osd.2 (osd.2) 66 : cluster [DBG] 6.13 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:01.595683+0000 osd.2 (osd.2) 67 : cluster [DBG] 6.13 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57556992 unmapped: 1105920 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:33.229782+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 69 sent 67 num 2 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:02.559596+0000 osd.2 (osd.2) 68 : cluster [DBG] 6.15 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:02.580464+0000 osd.2 (osd.2) 69 : cluster [DBG] 6.15 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.1f deep-scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 6.1f deep-scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 69) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:02.559596+0000 osd.2 (osd.2) 68 : cluster [DBG] 6.15 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:02.580464+0000 osd.2 (osd.2) 69 : cluster [DBG] 6.15 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57556992 unmapped: 1105920 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:34.230033+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 71 sent 69 num 2 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:03.582987+0000 osd.2 (osd.2) 70 : cluster [DBG] 6.1f deep-scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:03.604146+0000 osd.2 (osd.2) 71 : cluster [DBG] 6.1f deep-scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 377377 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 71) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:03.582987+0000 osd.2 (osd.2) 70 : cluster [DBG] 6.1f deep-scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:03.604146+0000 osd.2 (osd.2) 71 : cluster [DBG] 6.1f deep-scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57516032 unmapped: 1146880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:35.230254+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57516032 unmapped: 1146880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:36.230354+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57524224 unmapped: 1138688 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:37.230503+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1c scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 4.1c scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57524224 unmapped: 1138688 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:38.230737+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 73 sent 71 num 2 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:07.589006+0000 osd.2 (osd.2) 72 : cluster [DBG] 4.1c scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:07.603094+0000 osd.2 (osd.2) 73 : cluster [DBG] 4.1c scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57524224 unmapped: 1138688 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 73) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:07.589006+0000 osd.2 (osd.2) 72 : cluster [DBG] 4.1c scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:07.603094+0000 osd.2 (osd.2) 73 : cluster [DBG] 4.1c scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:39.231019+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 378525 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57540608 unmapped: 1122304 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:40.231207+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57548800 unmapped: 1114112 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:41.231432+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57565184 unmapped: 1097728 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:42.231635+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57565184 unmapped: 1097728 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:43.231789+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57565184 unmapped: 1097728 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:44.231967+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.18 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.905238152s of 15.089574814s, submitted: 10
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.18 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 379673 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57589760 unmapped: 1073152 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:45.232124+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 75 sent 73 num 2 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:14.646727+0000 osd.2 (osd.2) 74 : cluster [DBG] 3.18 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:14.660813+0000 osd.2 (osd.2) 75 : cluster [DBG] 3.18 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1c scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1c scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57589760 unmapped: 1073152 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 75) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:14.646727+0000 osd.2 (osd.2) 74 : cluster [DBG] 3.18 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:14.660813+0000 osd.2 (osd.2) 75 : cluster [DBG] 3.18 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:46.232369+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 77 sent 75 num 2 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:15.630119+0000 osd.2 (osd.2) 76 : cluster [DBG] 7.1c scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:15.644206+0000 osd.2 (osd.2) 77 : cluster [DBG] 7.1c scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57589760 unmapped: 1073152 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 77) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:15.630119+0000 osd.2 (osd.2) 76 : cluster [DBG] 7.1c scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:15.644206+0000 osd.2 (osd.2) 77 : cluster [DBG] 7.1c scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:47.232569+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57597952 unmapped: 1064960 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:48.232806+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57597952 unmapped: 1064960 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:49.233013+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 380821 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57606144 unmapped: 1056768 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:50.233185+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.16 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.16 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57614336 unmapped: 1048576 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:51.233384+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 79 sent 77 num 2 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:20.630324+0000 osd.2 (osd.2) 78 : cluster [DBG] 3.16 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:20.644341+0000 osd.2 (osd.2) 79 : cluster [DBG] 3.16 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57622528 unmapped: 1040384 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 79) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:20.630324+0000 osd.2 (osd.2) 78 : cluster [DBG] 3.16 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:20.644341+0000 osd.2 (osd.2) 79 : cluster [DBG] 3.16 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:52.233620+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57622528 unmapped: 1040384 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:53.233817+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57622528 unmapped: 1040384 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:54.233987+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 381969 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57630720 unmapped: 1032192 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:55.234139+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57630720 unmapped: 1032192 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:56.234380+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57638912 unmapped: 1024000 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:57.234568+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57638912 unmapped: 1024000 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:58.234887+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.11 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.969743729s of 13.996441841s, submitted: 6
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.11 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57655296 unmapped: 1007616 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:59.235195+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 81 sent 79 num 2 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:28.643193+0000 osd.2 (osd.2) 80 : cluster [DBG] 3.11 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:28.657255+0000 osd.2 (osd.2) 81 : cluster [DBG] 3.11 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 383117 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57655296 unmapped: 1007616 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 81) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:28.643193+0000 osd.2 (osd.2) 80 : cluster [DBG] 3.11 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:28.657255+0000 osd.2 (osd.2) 81 : cluster [DBG] 3.11 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:00.235541+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57655296 unmapped: 1007616 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:01.235745+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57663488 unmapped: 999424 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:02.235892+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57671680 unmapped: 991232 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:03.236073+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.15 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.15 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57679872 unmapped: 983040 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:04.236248+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 83 sent 81 num 2 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:33.523155+0000 osd.2 (osd.2) 82 : cluster [DBG] 7.15 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:33.537185+0000 osd.2 (osd.2) 83 : cluster [DBG] 7.15 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 384265 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 83) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:33.523155+0000 osd.2 (osd.2) 82 : cluster [DBG] 7.15 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:33.537185+0000 osd.2 (osd.2) 83 : cluster [DBG] 7.15 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57688064 unmapped: 974848 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:05.236533+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57688064 unmapped: 974848 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:06.236729+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.11 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.11 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57696256 unmapped: 966656 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:07.236900+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 85 sent 83 num 2 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:36.449907+0000 osd.2 (osd.2) 84 : cluster [DBG] 7.11 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:36.464074+0000 osd.2 (osd.2) 85 : cluster [DBG] 7.11 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57696256 unmapped: 966656 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 85) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:36.449907+0000 osd.2 (osd.2) 84 : cluster [DBG] 7.11 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:36.464074+0000 osd.2 (osd.2) 85 : cluster [DBG] 7.11 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:08.237237+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.e scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.e scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57704448 unmapped: 958464 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:09.237381+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 87 sent 85 num 2 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:38.488088+0000 osd.2 (osd.2) 86 : cluster [DBG] 3.e scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:38.502194+0000 osd.2 (osd.2) 87 : cluster [DBG] 3.e scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 386560 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57704448 unmapped: 958464 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 87) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:38.488088+0000 osd.2 (osd.2) 86 : cluster [DBG] 3.e scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:38.502194+0000 osd.2 (osd.2) 87 : cluster [DBG] 3.e scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:10.237574+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57712640 unmapped: 950272 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:11.237731+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57720832 unmapped: 942080 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:12.237902+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.a scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.829962730s of 13.884933472s, submitted: 8
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.a scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57745408 unmapped: 917504 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:13.238091+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 89 sent 87 num 2 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:42.528237+0000 osd.2 (osd.2) 88 : cluster [DBG] 7.a scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:42.542392+0000 osd.2 (osd.2) 89 : cluster [DBG] 7.a scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57761792 unmapped: 901120 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 89) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:42.528237+0000 osd.2 (osd.2) 88 : cluster [DBG] 7.a scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:42.542392+0000 osd.2 (osd.2) 89 : cluster [DBG] 7.a scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:14.238344+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 387707 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57761792 unmapped: 901120 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:15.238498+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57761792 unmapped: 901120 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:16.238665+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57769984 unmapped: 892928 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:17.238817+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57769984 unmapped: 892928 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:18.239003+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57778176 unmapped: 884736 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:19.239186+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.8 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.8 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 388854 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57778176 unmapped: 884736 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:20.239367+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 91 sent 89 num 2 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:49.596230+0000 osd.2 (osd.2) 90 : cluster [DBG] 7.8 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:49.610164+0000 osd.2 (osd.2) 91 : cluster [DBG] 7.8 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 91) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:49.596230+0000 osd.2 (osd.2) 90 : cluster [DBG] 7.8 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:49.610164+0000 osd.2 (osd.2) 91 : cluster [DBG] 7.8 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57778176 unmapped: 884736 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:21.239838+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57786368 unmapped: 876544 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:22.240031+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.5 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.005264282s of 10.019852638s, submitted: 4
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.5 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57794560 unmapped: 868352 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:23.240166+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 93 sent 91 num 2 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:52.548059+0000 osd.2 (osd.2) 92 : cluster [DBG] 7.5 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:52.562164+0000 osd.2 (osd.2) 93 : cluster [DBG] 7.5 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 93) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:52.548059+0000 osd.2 (osd.2) 92 : cluster [DBG] 7.5 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:52.562164+0000 osd.2 (osd.2) 93 : cluster [DBG] 7.5 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57794560 unmapped: 868352 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:24.240337+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 390001 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57802752 unmapped: 860160 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:25.240531+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57802752 unmapped: 860160 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:26.240730+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.2 deep-scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.2 deep-scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57819136 unmapped: 843776 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:27.240954+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 95 sent 93 num 2 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:56.516094+0000 osd.2 (osd.2) 94 : cluster [DBG] 7.2 deep-scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:56.530250+0000 osd.2 (osd.2) 95 : cluster [DBG] 7.2 deep-scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 95) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:56.516094+0000 osd.2 (osd.2) 94 : cluster [DBG] 7.2 deep-scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:56.530250+0000 osd.2 (osd.2) 95 : cluster [DBG] 7.2 deep-scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57827328 unmapped: 835584 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:28.241601+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.8 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.8 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57843712 unmapped: 819200 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:29.241817+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 97 sent 95 num 2 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:58.460358+0000 osd.2 (osd.2) 96 : cluster [DBG] 3.8 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:58.474342+0000 osd.2 (osd.2) 97 : cluster [DBG] 3.8 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 97) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:58.460358+0000 osd.2 (osd.2) 96 : cluster [DBG] 3.8 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:58.474342+0000 osd.2 (osd.2) 97 : cluster [DBG] 3.8 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.c deep-scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.c deep-scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 393442 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57843712 unmapped: 819200 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:30.242180+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 99 sent 97 num 2 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:59.466738+0000 osd.2 (osd.2) 98 : cluster [DBG] 7.c deep-scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:17:59.480846+0000 osd.2 (osd.2) 99 : cluster [DBG] 7.c deep-scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.7 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.7 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 99) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:59.466738+0000 osd.2 (osd.2) 98 : cluster [DBG] 7.c deep-scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:17:59.480846+0000 osd.2 (osd.2) 99 : cluster [DBG] 7.c deep-scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57851904 unmapped: 811008 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:31.242584+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 101 sent 99 num 2 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:18:00.429138+0000 osd.2 (osd.2) 100 : cluster [DBG] 3.7 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:18:00.443057+0000 osd.2 (osd.2) 101 : cluster [DBG] 3.7 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 101) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:18:00.429138+0000 osd.2 (osd.2) 100 : cluster [DBG] 3.7 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:18:00.443057+0000 osd.2 (osd.2) 101 : cluster [DBG] 3.7 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57851904 unmapped: 811008 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:32.242834+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57851904 unmapped: 811008 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:33.242973+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57860096 unmapped: 802816 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:34.243597+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 394589 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57868288 unmapped: 794624 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:35.243764+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57868288 unmapped: 794624 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:36.244002+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57876480 unmapped: 786432 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:37.244331+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57876480 unmapped: 786432 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:38.244751+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1a scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.707541466s of 15.794960022s, submitted: 10
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1a scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57884672 unmapped: 778240 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:39.245451+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 103 sent 101 num 2 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:18:08.343347+0000 osd.2 (osd.2) 102 : cluster [DBG] 7.1a scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:18:08.357321+0000 osd.2 (osd.2) 103 : cluster [DBG] 7.1a scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 103) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:18:08.343347+0000 osd.2 (osd.2) 102 : cluster [DBG] 7.1a scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:18:08.357321+0000 osd.2 (osd.2) 103 : cluster [DBG] 7.1a scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 395737 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57884672 unmapped: 778240 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:40.245886+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57884672 unmapped: 778240 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:41.246007+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57892864 unmapped: 770048 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:42.246167+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.1 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57909248 unmapped: 753664 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:43.246310+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 105 sent 103 num 2 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:18:12.300849+0000 osd.2 (osd.2) 104 : cluster [DBG] 7.1 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:18:12.314935+0000 osd.2 (osd.2) 105 : cluster [DBG] 7.1 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.1d scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.1d scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 105) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:18:12.300849+0000 osd.2 (osd.2) 104 : cluster [DBG] 7.1 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:18:12.314935+0000 osd.2 (osd.2) 105 : cluster [DBG] 7.1 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57917440 unmapped: 745472 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:44.246625+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 107 sent 105 num 2 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:18:13.349355+0000 osd.2 (osd.2) 106 : cluster [DBG] 3.1d scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:18:13.363458+0000 osd.2 (osd.2) 107 : cluster [DBG] 3.1d scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 398032 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 107) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:18:13.349355+0000 osd.2 (osd.2) 106 : cluster [DBG] 3.1d scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:18:13.363458+0000 osd.2 (osd.2) 107 : cluster [DBG] 3.1d scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57917440 unmapped: 745472 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:45.246842+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57917440 unmapped: 745472 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:46.247004+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.5 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.5 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57925632 unmapped: 737280 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:47.247557+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 109 sent 107 num 2 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:18:16.401251+0000 osd.2 (osd.2) 108 : cluster [DBG] 3.5 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:18:16.415372+0000 osd.2 (osd.2) 109 : cluster [DBG] 3.5 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.1e scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 3.1e scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57925632 unmapped: 737280 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 109) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:18:16.401251+0000 osd.2 (osd.2) 108 : cluster [DBG] 3.5 scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:18:16.415372+0000 osd.2 (osd.2) 109 : cluster [DBG] 3.5 scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:48.247791+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 111 sent 109 num 2 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:18:17.371499+0000 osd.2 (osd.2) 110 : cluster [DBG] 3.1e scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:18:17.385573+0000 osd.2 (osd.2) 111 : cluster [DBG] 3.1e scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57925632 unmapped: 737280 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 111) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:18:17.371499+0000 osd.2 (osd.2) 110 : cluster [DBG] 3.1e scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:18:17.385573+0000 osd.2 (osd.2) 111 : cluster [DBG] 3.1e scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:49.248087+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 400327 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57933824 unmapped: 729088 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:50.248284+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57942016 unmapped: 720896 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:51.248430+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57950208 unmapped: 712704 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:52.248583+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.e scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.057037354s of 14.124808311s, submitted: 10
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_channel(cluster) log [DBG] : 7.e scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57958400 unmapped: 704512 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:53.248757+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  log_queue is 2 last_log 113 sent 111 num 2 unsent 2 sending 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:18:22.467843+0000 osd.2 (osd.2) 112 : cluster [DBG] 7.e scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  will send 2025-12-01T09:18:22.481991+0000 osd.2 (osd.2) 113 : cluster [DBG] 7.e scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57958400 unmapped: 704512 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client handle_log_ack log(last 113) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:18:22.467843+0000 osd.2 (osd.2) 112 : cluster [DBG] 7.e scrub starts
Dec 01 09:48:00 compute-0 ceph-osd[90166]: log_client  logged 2025-12-01T09:18:22.481991+0000 osd.2 (osd.2) 113 : cluster [DBG] 7.e scrub ok
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:54.248955+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57966592 unmapped: 696320 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:55.249099+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57966592 unmapped: 696320 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:56.249253+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57982976 unmapped: 679936 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:57.249524+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57982976 unmapped: 679936 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:58.249738+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57991168 unmapped: 671744 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:59.249884+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57991168 unmapped: 671744 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:00.250050+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57991168 unmapped: 671744 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:01.250334+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57999360 unmapped: 663552 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:02.250641+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 57999360 unmapped: 663552 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:03.250932+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58007552 unmapped: 655360 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:04.251092+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58007552 unmapped: 655360 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:05.251254+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58015744 unmapped: 647168 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:06.251411+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58023936 unmapped: 638976 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:07.251690+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58023936 unmapped: 638976 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:08.252098+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58032128 unmapped: 630784 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:09.252378+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:10.252606+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58032128 unmapped: 630784 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:11.252767+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58040320 unmapped: 622592 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:12.253269+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58040320 unmapped: 622592 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:13.253586+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58040320 unmapped: 622592 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:14.254156+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58048512 unmapped: 614400 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:15.254497+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58040320 unmapped: 622592 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:16.254701+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58048512 unmapped: 614400 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:17.254907+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58048512 unmapped: 614400 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:18.255595+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58048512 unmapped: 614400 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:19.255792+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58056704 unmapped: 606208 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:20.255960+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58056704 unmapped: 606208 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:21.256171+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58064896 unmapped: 598016 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:22.256363+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58073088 unmapped: 589824 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:23.256514+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58081280 unmapped: 581632 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:24.256657+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58081280 unmapped: 581632 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:25.256839+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58081280 unmapped: 581632 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:26.257012+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58089472 unmapped: 573440 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:27.257230+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58089472 unmapped: 573440 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:28.257457+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58105856 unmapped: 557056 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:29.257589+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58105856 unmapped: 557056 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:30.257853+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58105856 unmapped: 557056 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:31.258101+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58114048 unmapped: 548864 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:32.258255+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58114048 unmapped: 548864 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:33.258419+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58114048 unmapped: 548864 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:34.258570+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58122240 unmapped: 540672 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:35.258732+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58130432 unmapped: 532480 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:36.258890+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58138624 unmapped: 524288 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:37.259054+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58138624 unmapped: 524288 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:38.259275+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58138624 unmapped: 524288 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:39.259419+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58146816 unmapped: 516096 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:40.259717+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58146816 unmapped: 516096 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:41.260086+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58155008 unmapped: 507904 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:42.260235+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58163200 unmapped: 499712 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:43.260384+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58171392 unmapped: 491520 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:44.260664+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58171392 unmapped: 491520 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:45.260947+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58171392 unmapped: 491520 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:46.261169+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58179584 unmapped: 483328 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:47.261383+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58179584 unmapped: 483328 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:48.261654+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58187776 unmapped: 475136 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:49.261808+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58187776 unmapped: 475136 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:50.261992+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58187776 unmapped: 475136 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:51.262186+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58195968 unmapped: 466944 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:52.262347+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58195968 unmapped: 466944 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:53.262516+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58195968 unmapped: 466944 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:54.262753+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58204160 unmapped: 458752 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:55.262898+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58204160 unmapped: 458752 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:56.263101+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58212352 unmapped: 450560 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:57.263338+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58212352 unmapped: 450560 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:58.263638+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58220544 unmapped: 442368 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:59.263842+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58220544 unmapped: 442368 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:00.264059+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58220544 unmapped: 442368 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:01.264281+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58228736 unmapped: 434176 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:02.264631+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58228736 unmapped: 434176 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:03.264869+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58228736 unmapped: 434176 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:04.265023+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58236928 unmapped: 425984 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:05.265196+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58236928 unmapped: 425984 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:06.265445+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58245120 unmapped: 417792 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:07.265682+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58253312 unmapped: 409600 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:08.266042+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58253312 unmapped: 409600 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:09.266480+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58261504 unmapped: 401408 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:10.266674+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58261504 unmapped: 401408 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:11.266870+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58269696 unmapped: 393216 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:12.267060+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58269696 unmapped: 393216 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:13.267367+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58269696 unmapped: 393216 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:14.267508+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58277888 unmapped: 385024 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:15.267734+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58286080 unmapped: 376832 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:16.267891+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58294272 unmapped: 368640 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:17.268108+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58294272 unmapped: 368640 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:18.268332+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58294272 unmapped: 368640 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:19.268553+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58302464 unmapped: 360448 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:20.269490+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58302464 unmapped: 360448 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:21.269666+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58310656 unmapped: 352256 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:22.269860+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58310656 unmapped: 352256 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:23.270033+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58310656 unmapped: 352256 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:24.270210+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58318848 unmapped: 344064 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:25.270377+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58327040 unmapped: 335872 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:26.270539+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58335232 unmapped: 327680 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:27.271285+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58335232 unmapped: 327680 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:28.271624+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58343424 unmapped: 319488 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:29.271812+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58343424 unmapped: 319488 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:30.271986+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58359808 unmapped: 303104 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:31.272162+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58359808 unmapped: 303104 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:32.272399+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58359808 unmapped: 303104 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:33.272618+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58368000 unmapped: 294912 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:34.272875+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58368000 unmapped: 294912 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:35.273122+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58376192 unmapped: 286720 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:36.273278+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58384384 unmapped: 278528 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:37.273449+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58384384 unmapped: 278528 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:38.273632+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58392576 unmapped: 270336 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:39.273812+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58392576 unmapped: 270336 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:40.274064+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58392576 unmapped: 270336 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:41.274206+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58400768 unmapped: 262144 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:42.274470+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58400768 unmapped: 262144 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:43.274701+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58408960 unmapped: 253952 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:44.274864+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58408960 unmapped: 253952 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:45.275135+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58425344 unmapped: 237568 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:46.275349+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58433536 unmapped: 229376 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:47.275518+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58433536 unmapped: 229376 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:48.275703+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58441728 unmapped: 221184 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:49.275847+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58441728 unmapped: 221184 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:50.276009+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58441728 unmapped: 221184 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:51.276376+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58449920 unmapped: 212992 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:52.276704+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58449920 unmapped: 212992 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:53.276955+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58449920 unmapped: 212992 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:54.277172+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58458112 unmapped: 204800 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:55.277429+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58458112 unmapped: 204800 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:56.277677+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58466304 unmapped: 196608 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:57.277913+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58466304 unmapped: 196608 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:58.278171+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58466304 unmapped: 196608 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:59.278427+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58474496 unmapped: 188416 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:00.278658+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58474496 unmapped: 188416 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:01.278871+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58482688 unmapped: 180224 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:02.283786+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58482688 unmapped: 180224 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:03.283914+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58490880 unmapped: 172032 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:04.284067+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58490880 unmapped: 172032 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:05.284250+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58507264 unmapped: 155648 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:06.284514+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58515456 unmapped: 147456 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:07.284794+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58515456 unmapped: 147456 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:08.284994+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58515456 unmapped: 147456 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:09.285348+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58523648 unmapped: 139264 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:10.285634+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58523648 unmapped: 139264 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:11.285842+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58523648 unmapped: 139264 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:12.286087+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58531840 unmapped: 131072 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:13.286262+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58531840 unmapped: 131072 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:14.286442+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58540032 unmapped: 122880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:15.286609+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58507264 unmapped: 155648 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:16.286746+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58507264 unmapped: 155648 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:17.286890+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58515456 unmapped: 147456 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:18.287121+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58515456 unmapped: 147456 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:19.287348+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58523648 unmapped: 139264 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:20.287507+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58515456 unmapped: 147456 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:21.287759+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58515456 unmapped: 147456 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:22.287923+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58523648 unmapped: 139264 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:23.288121+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58523648 unmapped: 139264 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:24.288415+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58531840 unmapped: 131072 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:25.288582+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58531840 unmapped: 131072 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:26.288800+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58531840 unmapped: 131072 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:27.289031+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58540032 unmapped: 122880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:28.289215+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58540032 unmapped: 122880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:29.289423+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58540032 unmapped: 122880 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:30.289619+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58548224 unmapped: 114688 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:31.289823+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58548224 unmapped: 114688 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:32.290004+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58556416 unmapped: 106496 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:33.290232+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58556416 unmapped: 106496 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:34.291016+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58564608 unmapped: 98304 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:35.291205+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58564608 unmapped: 98304 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:36.291408+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58564608 unmapped: 98304 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:37.291655+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58572800 unmapped: 90112 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:38.291908+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58572800 unmapped: 90112 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:39.292132+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58580992 unmapped: 81920 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:40.292365+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58580992 unmapped: 81920 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:41.292583+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58580992 unmapped: 81920 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:42.292796+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58589184 unmapped: 73728 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:43.292936+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58597376 unmapped: 65536 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:44.293103+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58605568 unmapped: 57344 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:45.293405+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58605568 unmapped: 57344 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:46.293539+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58605568 unmapped: 57344 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:47.293682+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58613760 unmapped: 49152 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:48.293985+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58613760 unmapped: 49152 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:49.294173+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58621952 unmapped: 40960 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:50.294354+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58621952 unmapped: 40960 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:51.294501+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58630144 unmapped: 32768 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:52.294647+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58630144 unmapped: 32768 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:53.294869+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58630144 unmapped: 32768 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:54.295024+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58638336 unmapped: 24576 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:55.295229+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58638336 unmapped: 24576 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:56.295402+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58638336 unmapped: 24576 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:57.295613+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58646528 unmapped: 16384 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:58.295840+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58646528 unmapped: 16384 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:59.295999+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58654720 unmapped: 8192 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:00.296453+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58654720 unmapped: 8192 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:01.296618+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58662912 unmapped: 0 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:02.296778+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58662912 unmapped: 0 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:03.296941+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58662912 unmapped: 0 heap: 58662912 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:04.297153+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58671104 unmapped: 1040384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:05.297324+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58671104 unmapped: 1040384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:06.297504+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58679296 unmapped: 1032192 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:07.297734+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58679296 unmapped: 1032192 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:08.297992+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58679296 unmapped: 1032192 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:09.298174+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58687488 unmapped: 1024000 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:10.298366+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58687488 unmapped: 1024000 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:11.298564+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58695680 unmapped: 1015808 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:12.298715+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58695680 unmapped: 1015808 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:13.298870+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58695680 unmapped: 1015808 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:14.299020+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58695680 unmapped: 1015808 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:15.299202+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58703872 unmapped: 1007616 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:16.299354+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58703872 unmapped: 1007616 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:17.299611+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58703872 unmapped: 1007616 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:18.299842+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58712064 unmapped: 999424 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:19.300421+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58712064 unmapped: 999424 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:20.301030+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58720256 unmapped: 991232 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:21.301347+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58720256 unmapped: 991232 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:22.301843+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58728448 unmapped: 983040 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:23.302267+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58728448 unmapped: 983040 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:24.302628+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58728448 unmapped: 983040 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:25.302990+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58744832 unmapped: 966656 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:26.303169+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58744832 unmapped: 966656 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:27.303479+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58753024 unmapped: 958464 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:28.303786+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58753024 unmapped: 958464 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:29.304083+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58753024 unmapped: 958464 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:30.305049+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58769408 unmapped: 942080 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:31.305403+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58769408 unmapped: 942080 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:32.306173+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58769408 unmapped: 942080 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:33.306367+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58777600 unmapped: 933888 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:34.306535+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58777600 unmapped: 933888 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:35.306755+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58785792 unmapped: 925696 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:36.306890+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58785792 unmapped: 925696 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:37.307045+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58785792 unmapped: 925696 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:38.307240+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58793984 unmapped: 917504 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:39.308074+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58793984 unmapped: 917504 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:40.308433+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58802176 unmapped: 909312 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:41.308648+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58802176 unmapped: 909312 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:42.308865+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58802176 unmapped: 909312 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:43.309082+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58810368 unmapped: 901120 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:44.309218+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58810368 unmapped: 901120 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:45.309347+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58810368 unmapped: 901120 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:46.309484+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58818560 unmapped: 892928 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:47.309722+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58818560 unmapped: 892928 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:48.309998+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58818560 unmapped: 892928 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:49.310204+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58826752 unmapped: 884736 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:50.310384+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58826752 unmapped: 884736 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:51.310555+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58834944 unmapped: 876544 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:52.310691+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58834944 unmapped: 876544 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:53.310822+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58843136 unmapped: 868352 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:54.310972+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58843136 unmapped: 868352 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:55.311124+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58843136 unmapped: 868352 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:56.311362+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58851328 unmapped: 860160 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:57.311584+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58851328 unmapped: 860160 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:58.311811+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58851328 unmapped: 860160 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:59.312004+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58859520 unmapped: 851968 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:00.312303+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58859520 unmapped: 851968 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:01.312471+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58859520 unmapped: 851968 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:02.312588+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58867712 unmapped: 843776 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:03.312712+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58867712 unmapped: 843776 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:04.312904+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58875904 unmapped: 835584 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:05.313050+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58875904 unmapped: 835584 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:06.313196+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58884096 unmapped: 827392 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:07.313452+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58884096 unmapped: 827392 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:08.313663+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58884096 unmapped: 827392 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:09.313984+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58892288 unmapped: 819200 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:10.314126+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58892288 unmapped: 819200 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:11.314377+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58900480 unmapped: 811008 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:12.314521+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58900480 unmapped: 811008 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:13.314714+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58900480 unmapped: 811008 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:14.314848+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58908672 unmapped: 802816 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:15.315176+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58908672 unmapped: 802816 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:16.315354+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58916864 unmapped: 794624 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:17.315528+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58916864 unmapped: 794624 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:18.315704+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58916864 unmapped: 794624 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:19.315839+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58925056 unmapped: 786432 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:20.316014+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58925056 unmapped: 786432 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:21.316259+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58925056 unmapped: 786432 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:22.316447+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58933248 unmapped: 778240 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:23.316596+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58933248 unmapped: 778240 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:24.316722+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58933248 unmapped: 778240 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:25.316857+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58941440 unmapped: 770048 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:26.316986+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58941440 unmapped: 770048 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:27.317108+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58949632 unmapped: 761856 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:28.317307+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58949632 unmapped: 761856 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:29.317451+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58957824 unmapped: 753664 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:30.317558+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58966016 unmapped: 745472 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:31.317665+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58966016 unmapped: 745472 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:32.317756+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:33.317874+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58974208 unmapped: 737280 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:34.317997+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58974208 unmapped: 737280 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:35.318151+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58982400 unmapped: 729088 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:36.318265+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58982400 unmapped: 729088 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:37.318434+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58982400 unmapped: 729088 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:38.318685+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58990592 unmapped: 720896 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:39.318852+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58990592 unmapped: 720896 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:40.319067+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58982400 unmapped: 729088 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:41.319262+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58990592 unmapped: 720896 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:42.319440+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58990592 unmapped: 720896 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:43.319567+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58990592 unmapped: 720896 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:44.319772+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58998784 unmapped: 712704 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:45.319894+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 58998784 unmapped: 712704 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:46.320059+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59006976 unmapped: 704512 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:47.320263+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59006976 unmapped: 704512 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:48.320526+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59006976 unmapped: 704512 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:49.320681+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59015168 unmapped: 696320 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:50.320858+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59015168 unmapped: 696320 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:51.321021+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59023360 unmapped: 688128 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:52.321141+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59023360 unmapped: 688128 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:53.321275+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59031552 unmapped: 679936 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:54.321453+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59031552 unmapped: 679936 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:55.321628+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59031552 unmapped: 679936 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:56.321726+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59039744 unmapped: 671744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:57.321880+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59039744 unmapped: 671744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:58.322082+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59039744 unmapped: 671744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:59.322309+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59039744 unmapped: 671744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:00.322448+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59039744 unmapped: 671744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:01.322578+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59039744 unmapped: 671744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:02.322708+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59047936 unmapped: 663552 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:03.322830+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59047936 unmapped: 663552 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:04.322986+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59047936 unmapped: 663552 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:05.323151+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59056128 unmapped: 655360 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:06.323297+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59056128 unmapped: 655360 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:07.323419+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59064320 unmapped: 647168 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:08.323607+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59064320 unmapped: 647168 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:09.323753+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59064320 unmapped: 647168 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:10.323882+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59072512 unmapped: 638976 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:11.324020+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59072512 unmapped: 638976 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:12.324162+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59072512 unmapped: 638976 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:13.324299+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59080704 unmapped: 630784 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:14.324422+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59080704 unmapped: 630784 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:15.324559+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59088896 unmapped: 622592 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:16.324714+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59088896 unmapped: 622592 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:17.324947+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59088896 unmapped: 622592 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:18.325111+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59097088 unmapped: 614400 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:19.325234+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59097088 unmapped: 614400 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:20.325353+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59105280 unmapped: 606208 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:21.325503+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59105280 unmapped: 606208 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:22.325702+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59105280 unmapped: 606208 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:23.325837+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59105280 unmapped: 606208 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:24.326020+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59113472 unmapped: 598016 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:25.326204+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59113472 unmapped: 598016 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:26.326700+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59113472 unmapped: 598016 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:27.326905+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59121664 unmapped: 589824 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:28.327154+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59129856 unmapped: 581632 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:29.327377+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59129856 unmapped: 581632 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:30.327639+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59129856 unmapped: 581632 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:31.327896+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59129856 unmapped: 581632 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:32.328162+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59138048 unmapped: 573440 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:33.328347+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59138048 unmapped: 573440 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:34.328633+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59146240 unmapped: 565248 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:35.328895+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59146240 unmapped: 565248 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:36.329009+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59154432 unmapped: 557056 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:37.329199+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59154432 unmapped: 557056 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:38.329423+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59162624 unmapped: 548864 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:39.329880+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59162624 unmapped: 548864 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:40.330178+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59170816 unmapped: 540672 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:41.330392+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59179008 unmapped: 532480 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:42.330529+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59179008 unmapped: 532480 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:43.330668+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59187200 unmapped: 524288 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:44.330811+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59195392 unmapped: 516096 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:45.330996+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59195392 unmapped: 516096 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:46.331146+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59195392 unmapped: 516096 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:47.331319+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59203584 unmapped: 507904 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:48.331533+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59203584 unmapped: 507904 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:49.331751+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59211776 unmapped: 499712 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:50.331915+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59211776 unmapped: 499712 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:51.332172+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59211776 unmapped: 499712 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:52.332467+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59211776 unmapped: 499712 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:53.332778+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59219968 unmapped: 491520 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Cumulative writes: 4162 writes, 19K keys, 4162 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
                                           Cumulative WAL: 4162 writes, 352 syncs, 11.82 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 4162 writes, 19K keys, 4162 commit groups, 1.0 writes per commit group, ingest: 15.87 MB, 0.03 MB/s
                                           Interval WAL: 4162 writes, 352 syncs, 11.82 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.055       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.055       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.05              0.00         1    0.055       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:54.333405+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59293696 unmapped: 417792 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:55.333598+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59293696 unmapped: 417792 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:56.333788+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59293696 unmapped: 417792 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:57.334062+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59301888 unmapped: 409600 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:58.334561+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59301888 unmapped: 409600 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:59.334735+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59301888 unmapped: 409600 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:00.334928+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59310080 unmapped: 401408 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:01.335094+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59310080 unmapped: 401408 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:02.335230+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:03.335372+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:04.335553+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:05.335710+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:06.335807+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:07.335961+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59326464 unmapped: 385024 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:08.336162+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59326464 unmapped: 385024 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:09.336316+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59326464 unmapped: 385024 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:10.336416+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59334656 unmapped: 376832 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:11.336558+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59334656 unmapped: 376832 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:12.336690+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59334656 unmapped: 376832 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:13.336818+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59342848 unmapped: 368640 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:14.336962+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59342848 unmapped: 368640 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:15.337107+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59351040 unmapped: 360448 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:16.337596+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59351040 unmapped: 360448 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:17.337767+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59351040 unmapped: 360448 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:18.337919+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59359232 unmapped: 352256 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:19.338153+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59359232 unmapped: 352256 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:20.338358+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59367424 unmapped: 344064 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:21.338537+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59367424 unmapped: 344064 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:22.338731+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59367424 unmapped: 344064 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:23.338894+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59375616 unmapped: 335872 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:24.339043+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59375616 unmapped: 335872 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:25.339259+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59375616 unmapped: 335872 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:26.339412+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59383808 unmapped: 327680 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:27.339547+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59383808 unmapped: 327680 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:28.339736+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59383808 unmapped: 327680 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:29.339887+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59392000 unmapped: 319488 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:30.340100+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59392000 unmapped: 319488 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:31.340360+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59400192 unmapped: 311296 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:32.340533+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59400192 unmapped: 311296 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:33.340688+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59408384 unmapped: 303104 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:34.340845+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59408384 unmapped: 303104 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:35.340986+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59408384 unmapped: 303104 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:36.341195+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59416576 unmapped: 294912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:37.341381+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59416576 unmapped: 294912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:38.341660+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59424768 unmapped: 286720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:39.341826+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59424768 unmapped: 286720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:40.341993+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59424768 unmapped: 286720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:41.342469+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59432960 unmapped: 278528 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:42.342647+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59432960 unmapped: 278528 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:43.342825+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59441152 unmapped: 270336 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:44.342933+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59449344 unmapped: 262144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:45.343157+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59449344 unmapped: 262144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:46.343376+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59449344 unmapped: 262144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:47.343607+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59457536 unmapped: 253952 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:48.343808+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59457536 unmapped: 253952 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:49.343963+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59457536 unmapped: 253952 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:50.344205+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59465728 unmapped: 245760 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:51.344392+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59465728 unmapped: 245760 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:52.344752+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59473920 unmapped: 237568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:53.344892+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59473920 unmapped: 237568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:54.345048+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59473920 unmapped: 237568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:55.345398+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59482112 unmapped: 229376 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:56.345704+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59482112 unmapped: 229376 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:57.345863+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59482112 unmapped: 229376 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:58.346124+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59490304 unmapped: 221184 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:59.346363+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59490304 unmapped: 221184 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:00.346553+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59498496 unmapped: 212992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:01.346708+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59498496 unmapped: 212992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:02.346982+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59498496 unmapped: 212992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:03.347154+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59506688 unmapped: 204800 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:04.347344+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59506688 unmapped: 204800 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:05.347522+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59514880 unmapped: 196608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:06.347741+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59514880 unmapped: 196608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:07.347890+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59514880 unmapped: 196608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:08.348282+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59523072 unmapped: 188416 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:09.348477+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59523072 unmapped: 188416 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:10.348625+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59539456 unmapped: 172032 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:11.348773+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59539456 unmapped: 172032 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:12.348894+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59547648 unmapped: 163840 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:13.349048+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59547648 unmapped: 163840 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:14.349181+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59547648 unmapped: 163840 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:15.349380+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59555840 unmapped: 155648 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:16.349588+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59555840 unmapped: 155648 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:17.349710+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59555840 unmapped: 155648 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:18.349945+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59564032 unmapped: 147456 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:19.350096+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59564032 unmapped: 147456 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:20.350269+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59572224 unmapped: 139264 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:21.350437+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59572224 unmapped: 139264 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:22.350556+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59580416 unmapped: 131072 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:23.350689+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59580416 unmapped: 131072 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:24.350821+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59580416 unmapped: 131072 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:25.350965+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59588608 unmapped: 122880 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:26.351129+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59588608 unmapped: 122880 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:27.351225+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59588608 unmapped: 122880 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:28.351343+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59596800 unmapped: 114688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:29.351461+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59596800 unmapped: 114688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:30.351593+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59596800 unmapped: 114688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:31.351717+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59604992 unmapped: 106496 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:32.351848+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59604992 unmapped: 106496 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:33.352015+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59613184 unmapped: 98304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:34.352156+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59613184 unmapped: 98304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:35.352308+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59613184 unmapped: 98304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:36.352423+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59621376 unmapped: 90112 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:37.352541+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59621376 unmapped: 90112 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:38.352681+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59629568 unmapped: 81920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:39.352861+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59629568 unmapped: 81920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:40.353053+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59629568 unmapped: 81920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:41.353209+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59637760 unmapped: 73728 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:42.353358+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59637760 unmapped: 73728 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:43.353537+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59629568 unmapped: 81920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:44.353682+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59637760 unmapped: 73728 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:45.353870+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59637760 unmapped: 73728 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:46.354023+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59637760 unmapped: 73728 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:47.354268+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59645952 unmapped: 65536 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:48.354540+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59645952 unmapped: 65536 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:49.354733+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59645952 unmapped: 65536 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:50.354935+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:51.355053+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:52.355188+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:53.355394+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:54.355535+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:55.355668+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:56.355867+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:57.356017+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:58.356184+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:59.356322+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:00.356524+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:01.356661+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:02.356801+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:03.356953+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:04.357171+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:05.357382+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:06.357522+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:07.357682+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:08.357882+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:09.358032+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:10.358214+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:11.358356+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:12.358500+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:13.358696+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:14.358841+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:15.358978+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:16.359091+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:17.359229+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:18.359373+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:19.359489+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:20.359647+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:21.359794+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:22.359934+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:23.360066+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:24.360190+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:25.360339+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:26.360477+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:27.360621+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:28.360811+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:29.360975+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:30.361100+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:31.361233+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:32.361358+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:33.361511+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:34.361620+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:35.361739+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:36.361863+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:37.361991+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:38.362221+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:39.362394+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:40.362706+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:41.362848+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:42.362993+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:43.363165+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:44.363311+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:45.363434+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:46.363573+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:47.363703+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:48.364082+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:49.364285+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:50.364632+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:51.364811+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:52.364964+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:53.365091+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:54.365230+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:55.365354+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:56.365471+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:57.365603+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:58.366024+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:59.366224+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:00.366385+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:01.366509+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:02.366669+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:03.366845+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:04.367035+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:05.367210+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:06.367442+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:07.367617+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:08.367914+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:09.368119+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:10.368259+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:11.368423+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:12.368566+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:13.368747+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:14.368926+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:15.369068+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:16.369194+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:17.369366+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:18.369542+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:19.374683+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:20.374814+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:21.374934+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:22.375070+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:23.375232+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:24.375463+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:25.375608+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:26.375742+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:27.375917+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:28.376096+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:29.376275+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:30.376429+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:31.376590+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:32.376730+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:33.376871+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:34.376992+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:35.377121+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:36.377254+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:37.377423+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:38.377570+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:39.377732+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:40.378003+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:41.378170+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:42.378297+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:43.378420+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:44.378557+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:45.378708+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:46.379007+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:47.379328+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:48.379577+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:49.379980+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:50.380282+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:51.380724+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:52.380914+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:53.381066+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:54.381932+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:55.382049+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:56.382223+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:57.382361+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:58.382613+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:59.382742+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:00.382954+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:01.383096+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:02.383402+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:03.383548+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:04.383724+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:05.383871+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:06.384015+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:07.384166+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:08.384348+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:09.384593+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:10.384746+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:11.384985+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:12.385154+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:13.385336+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:14.385590+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:15.385802+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:16.385968+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:17.386148+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:18.386343+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:19.386525+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:20.386690+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:21.386858+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:22.386994+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:23.387135+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:24.387321+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:25.387476+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:26.387656+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:27.387807+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:28.388018+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:29.388261+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:30.388670+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:31.388878+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:32.389047+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:33.389209+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:34.389399+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:35.389536+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:36.389715+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:37.389849+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:38.390085+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:39.390215+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:40.390391+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:41.390563+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:42.390819+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:43.391007+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:44.391274+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:45.391466+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:46.391580+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:47.391726+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:48.391919+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:49.392045+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:50.392191+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:51.392333+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:52.392481+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:53.392668+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:54.392782+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:55.393090+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:56.393456+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:57.393685+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:58.394032+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:59.394340+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:00.394506+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:01.394728+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:02.394863+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:03.395090+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:04.395284+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: mgrc ms_handle_reset ms_handle_reset con 0x5595d67d3c00
Dec 01 09:48:00 compute-0 ceph-osd[90166]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/3312476512
Dec 01 09:48:00 compute-0 ceph-osd[90166]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/3312476512,v1:192.168.122.100:6801/3312476512]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: get_auth_request con 0x5595d7ac8c00 auth_method 0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: mgrc handle_mgr_configure stats_period=5
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:05.395420+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15078 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:06.395630+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:07.395766+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:08.396007+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:09.396253+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:10.396481+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:11.396687+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:12.396902+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:13.397105+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:14.397308+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:15.397446+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:16.397624+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:17.397805+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:18.397986+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:19.398121+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:20.398259+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:21.398755+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:22.399363+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:23.401236+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:24.401918+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:25.402077+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:26.402304+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:27.402668+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:28.402798+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:29.402975+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:30.403226+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:31.403911+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:32.404059+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:33.404246+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:34.404434+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:35.404698+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:36.404814+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:37.404969+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:38.405230+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:39.405384+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:40.405524+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:41.405696+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:42.405875+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:43.406114+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:44.406259+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:45.406460+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:46.406667+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:47.406794+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:48.407047+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:49.407241+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:50.407357+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:51.407456+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:52.407618+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:53.407803+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:54.408081+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:55.408395+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:56.408825+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:57.409173+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:58.409369+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:59.409539+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:00.409889+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 ms_handle_reset con 0x5595d6fa8c00 session 0x5595d72c4960
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9203c00
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:01.410095+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:02.410272+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:03.410490+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:04.410652+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:05.410808+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:06.410938+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:07.411086+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:08.411305+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:09.411482+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:10.411615+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:11.411777+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:12.411925+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:13.412088+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:14.412347+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:15.412533+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:16.412746+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:17.413136+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:18.413390+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:19.413630+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:20.413871+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:21.414107+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:22.414235+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:23.414360+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:24.414683+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:25.414834+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:26.415065+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:27.415222+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:28.415390+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:29.415598+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 1064960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:30.415744+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:31.415892+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:32.416031+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:33.416200+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:34.416402+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:35.416535+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:36.416824+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:37.417002+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:38.417197+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:39.417400+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:40.417568+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:41.417748+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 1056768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:42.417892+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:43.418074+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:44.418216+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:45.418562+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:46.418671+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:47.418832+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:48.418966+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:49.419160+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:50.419318+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:51.419465+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:52.419831+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:53.419997+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:54.420157+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:55.420309+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:56.420440+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:57.420628+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:58.422779+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:59.423061+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:00.423220+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:01.423433+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:02.423795+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:03.423955+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:04.424091+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:05.424257+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:06.424408+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:07.424551+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:08.424747+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:09.425585+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:10.425734+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:11.425944+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:12.426081+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:13.426692+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:14.427137+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:15.427354+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:16.427548+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:17.427676+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:18.427895+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:19.428236+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:20.428441+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:21.428593+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:22.428869+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:23.429017+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:24.429370+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:25.429679+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:26.429945+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:27.430261+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:28.430513+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:29.430700+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:30.430985+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:31.431117+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:32.431270+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:33.431470+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:34.431648+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:35.431820+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:36.431975+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:37.432128+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:38.432303+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:39.432714+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:40.432851+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:41.432957+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:42.433147+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:43.433258+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:44.433380+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:45.433558+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:46.433734+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:47.433958+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:48.434579+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:49.434832+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:50.436192+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:51.436434+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:52.436633+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:53.437048+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:54.437354+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:55.437558+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:56.437703+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:57.438081+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:58.438360+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:59.438611+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:00.438783+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:01.438920+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:02.439993+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:03.440413+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:04.444590+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:05.446066+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:06.446599+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:07.447112+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:08.447798+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:09.448741+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:10.448870+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:11.449081+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:12.449399+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:13.449658+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:14.450383+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:15.450713+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:16.451009+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:17.451260+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:18.451484+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:19.451812+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:20.452233+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:21.452571+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:22.452887+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:23.453122+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:24.453376+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:25.453687+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:26.453842+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:27.454039+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:28.454524+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:29.454854+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:30.455175+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:31.455346+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:32.455555+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:33.455749+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:34.455897+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:35.456048+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:36.456188+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:37.456352+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:38.456507+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:39.456674+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:40.456999+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:41.457264+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:42.458476+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:43.458714+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:44.459003+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:45.459395+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:46.459668+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:47.459934+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:48.460237+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:49.460460+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:50.460781+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:51.461022+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:52.461257+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:53.461661+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:54.461967+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:55.462198+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:56.462429+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:57.462851+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:58.463140+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:59.463458+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:00.463696+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:01.464000+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:02.464361+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:03.464583+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:04.464874+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:05.465069+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:06.465254+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:07.465502+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:08.465895+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:09.466145+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:10.466425+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:11.466634+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:12.466800+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:13.466956+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:14.467120+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:15.467405+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:16.467656+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:17.467872+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:18.468119+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:19.468418+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:20.468651+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:21.468810+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:22.468964+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:23.469124+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:24.469326+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:25.469561+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:26.469814+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:27.470040+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:28.470360+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:29.470562+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:30.470814+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:31.471027+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:32.471341+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:33.471639+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:34.471931+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:35.472171+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:36.472504+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:37.472807+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:38.473124+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:39.473433+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:40.473709+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:41.474053+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:42.474428+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 1048576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:43.474638+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:44.474988+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:45.475252+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:46.475516+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:47.475755+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:48.476083+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:49.476404+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:50.476650+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:51.476924+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:52.477204+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:53.477418+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Cumulative writes: 4162 writes, 19K keys, 4162 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 4162 writes, 352 syncs, 11.82 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.035       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.055       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.055       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.05              0.00         1    0.055       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.4 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5595d598d1f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:54.477643+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:55.477878+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:56.478152+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:57.478371+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:58.478671+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:59.478880+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:00.479099+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:01.479545+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:02.479871+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:03.480116+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:04.480455+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:05.480811+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:06.481018+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:07.481227+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:08.481711+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:09.481935+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:10.482377+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:11.483271+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:12.483668+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:13.483954+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:14.484208+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:15.484518+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:16.484806+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:17.485036+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:18.485437+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:19.485718+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:20.486030+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:21.486544+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:22.486816+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:23.487082+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:24.487283+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:25.487478+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:26.487662+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:27.487794+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:28.488509+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:29.488687+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:30.488988+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:31.489189+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:32.489370+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:33.489640+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:34.489877+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:35.490047+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:36.490181+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:37.490530+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:38.490772+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:39.490972+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:40.491164+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:41.491340+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:42.491502+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:43.491661+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:44.491808+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:45.491970+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:46.492201+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:47.492394+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:48.492595+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:49.492755+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:50.492926+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:51.493275+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:52.493538+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:53.493753+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:54.494119+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:55.494396+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:56.494657+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:57.494843+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:58.495115+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:59.495382+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:00.495694+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:01.495970+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:02.496173+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:03.496482+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:04.496809+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:05.497061+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:06.497318+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:07.497524+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:08.497804+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:09.498095+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:10.498341+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:11.498567+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:12.498800+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:13.498970+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:14.499193+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:15.499487+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:16.499742+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:17.500030+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:18.500423+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:19.500836+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:20.501107+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:21.501432+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:22.501727+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:23.502269+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:24.502618+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:25.502852+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:26.503141+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:27.503455+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:28.503954+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:29.504167+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:30.504538+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:31.504812+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:32.505000+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:33.505222+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:34.505395+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:35.505579+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:36.505770+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:37.505989+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:38.506145+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:39.506384+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:40.506679+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:41.506876+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:42.507068+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:43.507362+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:44.507572+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:45.507835+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:46.508207+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:47.508570+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x2e302/0x7c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:48.508856+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401474 data_alloc: 218103808 data_used: 20480
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:49.509158+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d7ac9400
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 47 handle_osd_map epochs [48,48], i have 47, src has [1,48]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 1077.389038086s of 1077.396484375s, submitted: 2
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 48 heartbeat osd_stat(store_statfs(0x4fe14e000/0x0/0x4ffc00000, data 0x2f8bc/0x7f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:50.509377+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 48 handle_osd_map epochs [49,49], i have 48, src has [1,49]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 60096512 unmapped: 9977856 heap: 70074368 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:51.509753+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 49 handle_osd_map epochs [50,50], i have 49, src has [1,50]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 50 ms_handle_reset con 0x5595d7ac9400 session 0x5595d73d83c0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8136c00
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61227008 unmapped: 8847360 heap: 70074368 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:52.509989+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 50 heartbeat osd_stat(store_statfs(0x4fd065000/0x0/0x4ffc00000, data 0x11124f6/0x1168000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61538304 unmapped: 16932864 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:53.510477+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 50 handle_osd_map epochs [51,51], i have 50, src has [1,51]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 51 ms_handle_reset con 0x5595d8136c00 session 0x5595d6e51a40
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 540377 data_alloc: 218103808 data_used: 28672
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61579264 unmapped: 16891904 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:54.510823+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 51 heartbeat osd_stat(store_statfs(0x4fd061000/0x0/0x4ffc00000, data 0x1113acc/0x116b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61579264 unmapped: 16891904 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:55.511208+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 51 heartbeat osd_stat(store_statfs(0x4fd061000/0x0/0x4ffc00000, data 0x1113acc/0x116b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61579264 unmapped: 16891904 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:56.511466+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61579264 unmapped: 16891904 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:57.511808+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61579264 unmapped: 16891904 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:58.512097+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 51 heartbeat osd_stat(store_statfs(0x4fd061000/0x0/0x4ffc00000, data 0x1113acc/0x116b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 540377 data_alloc: 218103808 data_used: 28672
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61579264 unmapped: 16891904 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:59.512431+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61579264 unmapped: 16891904 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:00.512719+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61579264 unmapped: 16891904 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:01.513385+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 51 heartbeat osd_stat(store_statfs(0x4fd061000/0x0/0x4ffc00000, data 0x1113acc/0x116b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 51 handle_osd_map epochs [52,52], i have 51, src has [1,52]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 51 handle_osd_map epochs [52,52], i have 52, src has [1,52]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.469475746s of 11.739644051s, submitted: 54
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:02.513580+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:03.514081+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 542165 data_alloc: 218103808 data_used: 28672
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:04.514219+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:05.514421+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:06.514597+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 52 heartbeat osd_stat(store_statfs(0x4fd05f000/0x0/0x4ffc00000, data 0x1114f6c/0x116e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:07.514759+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:08.515017+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 542165 data_alloc: 218103808 data_used: 28672
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:09.515175+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 52 heartbeat osd_stat(store_statfs(0x4fd05f000/0x0/0x4ffc00000, data 0x1114f6c/0x116e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:10.515380+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:11.515573+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:12.515712+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:13.515896+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 542165 data_alloc: 218103808 data_used: 28672
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:14.516069+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 52 heartbeat osd_stat(store_statfs(0x4fd05f000/0x0/0x4ffc00000, data 0x1114f6c/0x116e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:15.516272+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:16.516544+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:17.516791+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:18.516994+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 542165 data_alloc: 218103808 data_used: 28672
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:19.517135+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:20.517401+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 52 heartbeat osd_stat(store_statfs(0x4fd05f000/0x0/0x4ffc00000, data 0x1114f6c/0x116e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:21.518043+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:22.519084+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:23.519675+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 542165 data_alloc: 218103808 data_used: 28672
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:24.520190+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:25.520856+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61595648 unmapped: 16875520 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8137000
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 24.064493179s of 24.076759338s, submitted: 13
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 52 heartbeat osd_stat(store_statfs(0x4fd05f000/0x0/0x4ffc00000, data 0x1114f8f/0x116f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 52 handle_osd_map epochs [53,53], i have 52, src has [1,53]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 53 ms_handle_reset con 0x5595d8137000 session 0x5595d8062f00
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:26.521168+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61677568 unmapped: 16793600 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8137400
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 53 ms_handle_reset con 0x5595d8137400 session 0x5595d729f680
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8137800
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 53 ms_handle_reset con 0x5595d8137800 session 0x5595d81a34a0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:27.521480+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61661184 unmapped: 16809984 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:28.521995+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 53 heartbeat osd_stat(store_statfs(0x4fd05b000/0x0/0x4ffc00000, data 0x1116549/0x1172000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 61661184 unmapped: 16809984 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d7ac9400
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 53 heartbeat osd_stat(store_statfs(0x4fd05b000/0x0/0x4ffc00000, data 0x1116549/0x1172000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 53 handle_osd_map epochs [53,54], i have 53, src has [1,54]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 53 handle_osd_map epochs [54,54], i have 54, src has [1,54]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 555402 data_alloc: 218103808 data_used: 45056
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 54 ms_handle_reset con 0x5595d7ac9400 session 0x5595d81a3e00
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:29.522364+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 62808064 unmapped: 15663104 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8137000
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 54 ms_handle_reset con 0x5595d8137000 session 0x5595d80623c0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8137400
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 54 ms_handle_reset con 0x5595d8137400 session 0x5595d6e50f00
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:30.522716+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8137c00
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 62824448 unmapped: 15646720 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 54 heartbeat osd_stat(store_statfs(0x4fd057000/0x0/0x4ffc00000, data 0x1117b13/0x1176000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 54 handle_osd_map epochs [54,55], i have 54, src has [1,55]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:31.522872+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 55 ms_handle_reset con 0x5595d8137c00 session 0x5595d6e512c0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 62988288 unmapped: 15482880 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9e800
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:32.523137+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 62988288 unmapped: 15482880 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 55 handle_osd_map epochs [56,56], i have 55, src has [1,56]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:33.523263+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 56 ms_handle_reset con 0x5595d9b9e800 session 0x5595d80921e0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 64208896 unmapped: 14262272 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 56 heartbeat osd_stat(store_statfs(0x4fd055000/0x0/0x4ffc00000, data 0x1119111/0x1178000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 559201 data_alloc: 218103808 data_used: 45056
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:34.523646+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 64241664 unmapped: 14229504 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9e800
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 56 handle_osd_map epochs [56,57], i have 56, src has [1,57]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 57 handle_osd_map epochs [57,57], i have 57, src has [1,57]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:35.523858+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 57 ms_handle_reset con 0x5595d9b9e800 session 0x5595d8075680
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 64421888 unmapped: 14049280 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:36.524199+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 64421888 unmapped: 14049280 heap: 78471168 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d7ac9400
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.944048882s of 11.445683479s, submitted: 143
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9ec00
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8136c00
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 57 handle_osd_map epochs [57,58], i have 57, src has [1,58]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 57 handle_osd_map epochs [58,58], i have 58, src has [1,58]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:37.524362+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 58 ms_handle_reset con 0x5595d9b9ec00 session 0x5595d729eb40
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 58 ms_handle_reset con 0x5595d7ac9400 session 0x5595d80a52c0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 65781760 unmapped: 21086208 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8137c00
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 58 handle_osd_map epochs [58,59], i have 58, src has [1,59]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 58 handle_osd_map epochs [59,59], i have 59, src has [1,59]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:38.524523+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 59 ms_handle_reset con 0x5595d8137c00 session 0x5595d7b001e0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 65945600 unmapped: 20922368 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8137400
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 59 heartbeat osd_stat(store_statfs(0x4fc84b000/0x0/0x4ffc00000, data 0x191d2db/0x1982000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 854268 data_alloc: 218103808 data_used: 61440
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:39.524726+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 59 handle_osd_map epochs [60,60], i have 59, src has [1,60]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 60 ms_handle_reset con 0x5595d8137400 session 0x5595d7bae780
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 60 ms_handle_reset con 0x5595d8136c00 session 0x5595d80785a0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 66093056 unmapped: 20774912 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8137400
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d7ac9400
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 60 handle_osd_map epochs [60,61], i have 60, src has [1,61]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 61 ms_handle_reset con 0x5595d8137400 session 0x5595d73d83c0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:40.524886+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8137c00
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 61 heartbeat osd_stat(store_statfs(0x4fa844000/0x0/0x4ffc00000, data 0x391fecd/0x398a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 66355200 unmapped: 20512768 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 61 handle_osd_map epochs [62,62], i have 61, src has [1,62]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 62 ms_handle_reset con 0x5595d8137c00 session 0x5595d8197860
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:41.525251+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 62 ms_handle_reset con 0x5595d7ac9400 session 0x5595d729e960
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9e800
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 66609152 unmapped: 20258816 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9ec00
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8137000
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 62 handle_osd_map epochs [63,63], i have 62, src has [1,63]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 63 ms_handle_reset con 0x5595d8137000 session 0x5595d729f680
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d7ac9400
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8136c00
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 63 ms_handle_reset con 0x5595d9b9e800 session 0x5595d81963c0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 63 heartbeat osd_stat(store_statfs(0x4fc83a000/0x0/0x4ffc00000, data 0x11237d9/0x1192000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 63 ms_handle_reset con 0x5595d9b9ec00 session 0x5595d80a4b40
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:42.525352+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68116480 unmapped: 18751488 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 63 handle_osd_map epochs [63,64], i have 63, src has [1,64]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 64 heartbeat osd_stat(store_statfs(0x4fc426000/0x0/0x4ffc00000, data 0x1124dd4/0x1194000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e3f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:43.525472+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68116480 unmapped: 18751488 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 64 handle_osd_map epochs [64,65], i have 64, src has [1,65]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 65 ms_handle_reset con 0x5595d8136c00 session 0x5595d7b16d20
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 65 ms_handle_reset con 0x5595d7ac9400 session 0x5595d8079a40
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8137400
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 65 ms_handle_reset con 0x5595d8137400 session 0x5595d8196b40
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 613840 data_alloc: 218103808 data_used: 122880
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:44.525631+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 65 heartbeat osd_stat(store_statfs(0x4fc422000/0x0/0x4ffc00000, data 0x11263ba/0x1197000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e3f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 18694144 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8137400
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 65 handle_osd_map epochs [66,66], i have 65, src has [1,66]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:45.525796+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 66 ms_handle_reset con 0x5595d8137400 session 0x5595d729f680
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68124672 unmapped: 18743296 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d7ac9400
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 66 handle_osd_map epochs [66,67], i have 66, src has [1,67]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 66 handle_osd_map epochs [67,67], i have 67, src has [1,67]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:46.525961+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 67 ms_handle_reset con 0x5595d7ac9400 session 0x5595d8092b40
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68132864 unmapped: 18735104 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8136c00
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 67 handle_osd_map epochs [67,68], i have 67, src has [1,68]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.158089638s of 10.213048935s, submitted: 251
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9e800
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:47.526078+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68386816 unmapped: 18481152 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 68 ms_handle_reset con 0x5595d8136c00 session 0x5595d7b01680
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9ec00
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 68 handle_osd_map epochs [69,69], i have 68, src has [1,69]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 69 ms_handle_reset con 0x5595d9b9e800 session 0x5595d82c2f00
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:48.526267+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 69 ms_handle_reset con 0x5595d9b9ec00 session 0x5595d729e3c0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68444160 unmapped: 18423808 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d7ac9400
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8136c00
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 69 heartbeat osd_stat(store_statfs(0x4fcc19000/0x0/0x4ffc00000, data 0x112ce2f/0x11a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e3f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 69 handle_osd_map epochs [70,70], i have 69, src has [1,70]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 630316 data_alloc: 218103808 data_used: 139264
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:49.526408+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 70 ms_handle_reset con 0x5595d7ac9400 session 0x5595d729eb40
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 70 ms_handle_reset con 0x5595d8136c00 session 0x5595d729fc20
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68624384 unmapped: 18243584 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8137400
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 70 handle_osd_map epochs [71,71], i have 70, src has [1,71]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:50.526534+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 71 ms_handle_reset con 0x5595d8137400 session 0x5595d8196f00
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68698112 unmapped: 18169856 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:51.526689+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68698112 unmapped: 18169856 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:52.526846+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9e800
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 71 ms_handle_reset con 0x5595d9b9e800 session 0x5595d73e14a0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68698112 unmapped: 18169856 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8137c00
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 71 ms_handle_reset con 0x5595d8137c00 session 0x5595d73e1680
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 71 heartbeat osd_stat(store_statfs(0x4fcc12000/0x0/0x4ffc00000, data 0x112f364/0x11a9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e3f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:53.527043+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68698112 unmapped: 18169856 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d7ac9400
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 71 ms_handle_reset con 0x5595d7ac9400 session 0x5595d73d83c0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8136c00
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 71 handle_osd_map epochs [72,72], i have 71, src has [1,72]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:54.527279+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 634276 data_alloc: 218103808 data_used: 139264
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68771840 unmapped: 18096128 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 72 handle_osd_map epochs [72,73], i have 72, src has [1,73]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 72 handle_osd_map epochs [73,73], i have 73, src has [1,73]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 73 ms_handle_reset con 0x5595d8136c00 session 0x5595d73d8f00
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:55.527505+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 73 heartbeat osd_stat(store_statfs(0x4fcc10000/0x0/0x4ffc00000, data 0x1130982/0x11ad000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e3f9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68771840 unmapped: 18096128 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8137400
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9e800
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 73 heartbeat osd_stat(store_statfs(0x4fba6d000/0x0/0x4ffc00000, data 0x113204f/0x11b0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:56.527653+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68788224 unmapped: 18079744 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:57.527796+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68788224 unmapped: 18079744 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:58.527977+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68788224 unmapped: 18079744 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9c800
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.005791664s of 11.874329567s, submitted: 231
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 73 handle_osd_map epochs [74,74], i have 73, src has [1,74]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:59.528100+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 74 ms_handle_reset con 0x5595d9b9c800 session 0x5595d739d860
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 640656 data_alloc: 218103808 data_used: 151552
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68911104 unmapped: 17956864 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9c400
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 74 handle_osd_map epochs [74,75], i have 74, src has [1,75]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 75 ms_handle_reset con 0x5595d9b9c400 session 0x5595d81974a0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:00.528373+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68919296 unmapped: 17948672 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9c000
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 75 heartbeat osd_stat(store_statfs(0x4fba64000/0x0/0x4ffc00000, data 0x1134c7d/0x11b8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:01.528514+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 75 ms_handle_reset con 0x5595d9b9c000 session 0x5595d80a4d20
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 68952064 unmapped: 17915904 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 75 heartbeat osd_stat(store_statfs(0x4fba64000/0x0/0x4ffc00000, data 0x1134c7d/0x11b8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d7ac9400
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 75 ms_handle_reset con 0x5595d7ac9400 session 0x5595d7d3a780
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8136c00
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:02.528707+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 75 handle_osd_map epochs [75,76], i have 75, src has [1,76]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 76 ms_handle_reset con 0x5595d8136c00 session 0x5595d72b8b40
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 69074944 unmapped: 17793024 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:03.528915+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9c000
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 76 handle_osd_map epochs [77,77], i have 76, src has [1,77]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 69099520 unmapped: 17768448 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 77 ms_handle_reset con 0x5595d9b9c000 session 0x5595d729e3c0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:04.529101+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9c400
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 654076 data_alloc: 218103808 data_used: 155648
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 77 handle_osd_map epochs [78,78], i have 77, src has [1,78]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 69255168 unmapped: 17612800 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 78 ms_handle_reset con 0x5595d9b9c400 session 0x5595d7b16b40
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:05.529362+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 78 handle_osd_map epochs [79,79], i have 78, src has [1,79]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70246400 unmapped: 16621568 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:06.529563+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 79 heartbeat osd_stat(store_statfs(0x4fba59000/0x0/0x4ffc00000, data 0x113a3c8/0x11c3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70246400 unmapped: 16621568 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:07.529916+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70246400 unmapped: 16621568 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9c800
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d9b9c800 session 0x5595d72c52c0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d7ac9400
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8136c00
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:08.530170+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d8136c00 session 0x5595d73d92c0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d7ac9400 session 0x5595d8197680
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9c000
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d9b9c000 session 0x5595d8074b40
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9c400
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d9b9c400 session 0x5595d729e960
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d72ad000
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d72ad000 session 0x5595d80a5a40
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d7ac9400
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70254592 unmapped: 16613376 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d7ac9400 session 0x5595d80a4960
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8136c00
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d8136c00 session 0x5595d7b01e00
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:09.530374+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 665586 data_alloc: 218103808 data_used: 172032
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70254592 unmapped: 16613376 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9c000
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 79 ms_handle_reset con 0x5595d9b9c000 session 0x5595d81a3a40
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 79 handle_osd_map epochs [79,80], i have 79, src has [1,80]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.607955933s of 11.038110733s, submitted: 98
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9c400
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 80 ms_handle_reset con 0x5595d9b9c400 session 0x5595d7b16b40
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d7348800
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:10.531083+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 80 ms_handle_reset con 0x5595d7348800 session 0x5595d7b165a0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 80 heartbeat osd_stat(store_statfs(0x4fba59000/0x0/0x4ffc00000, data 0x113a3c8/0x11c3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70205440 unmapped: 16662528 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d7ac9400
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8136c00
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:11.531245+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70205440 unmapped: 16662528 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 80 heartbeat osd_stat(store_statfs(0x4fba57000/0x0/0x4ffc00000, data 0x113b8a0/0x11c6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:12.531434+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70205440 unmapped: 16662528 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:13.531602+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70221824 unmapped: 16646144 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:14.531757+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 667454 data_alloc: 218103808 data_used: 172032
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70221824 unmapped: 16646144 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9c000
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 80 ms_handle_reset con 0x5595d9b9c000 session 0x5595d7bae3c0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9c400
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:15.531910+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70230016 unmapped: 16637952 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 80 handle_osd_map epochs [81,81], i have 80, src has [1,81]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 81 ms_handle_reset con 0x5595d9b9c400 session 0x5595d7b00780
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9598000
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9598400
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 81 ms_handle_reset con 0x5595d9598400 session 0x5595d8062f00
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 81 ms_handle_reset con 0x5595d9598000 session 0x5595d7b00f00
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:16.532072+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9598800
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 81 ms_handle_reset con 0x5595d9598800 session 0x5595d80743c0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9598000
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70279168 unmapped: 16588800 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 81 ms_handle_reset con 0x5595d9598000 session 0x5595d72b90e0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:17.532218+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 81 handle_osd_map epochs [82,82], i have 81, src has [1,82]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 82 heartbeat osd_stat(store_statfs(0x4fba54000/0x0/0x4ffc00000, data 0x113ce5a/0x11c9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70328320 unmapped: 16539648 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9598400
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 82 handle_osd_map epochs [83,83], i have 82, src has [1,83]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 83 ms_handle_reset con 0x5595d9598400 session 0x5595d73e0d20
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:18.532384+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70352896 unmapped: 16515072 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 83 heartbeat osd_stat(store_statfs(0x4fba50000/0x0/0x4ffc00000, data 0x113e468/0x11cc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9c000
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 83 ms_handle_reset con 0x5595d9b9c000 session 0x5595d739d0e0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9c400
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 83 ms_handle_reset con 0x5595d9b9c400 session 0x5595d6de4d20
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9598c00
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 83 ms_handle_reset con 0x5595d9598c00 session 0x5595d7b16b40
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:19.532537+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 679530 data_alloc: 218103808 data_used: 172032
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70336512 unmapped: 16531456 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9598000
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.183552742s of 10.257410049s, submitted: 36
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:20.532701+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70369280 unmapped: 16498688 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 83 ms_handle_reset con 0x5595d9598000 session 0x5595d7b165a0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9598400
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:21.532815+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 83 heartbeat osd_stat(store_statfs(0x4fba4e000/0x0/0x4ffc00000, data 0x113fa80/0x11d0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70377472 unmapped: 16490496 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 83 handle_osd_map epochs [84,84], i have 83, src has [1,84]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:22.532949+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 84 ms_handle_reset con 0x5595d9598400 session 0x5595d72c52c0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70385664 unmapped: 16482304 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:23.533124+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 84 ms_handle_reset con 0x5595d7ac9400 session 0x5595d7b170e0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 84 ms_handle_reset con 0x5595d8136c00 session 0x5595d73d9860
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70385664 unmapped: 16482304 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9b9c000
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:24.533342+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 680355 data_alloc: 218103808 data_used: 192512
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70385664 unmapped: 16482304 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 84 handle_osd_map epochs [84,85], i have 84, src has [1,85]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 84 heartbeat osd_stat(store_statfs(0x4fba4c000/0x0/0x4ffc00000, data 0x1141058/0x11d1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:25.533523+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70402048 unmapped: 16465920 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _renew_subs
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 85 handle_osd_map epochs [86,86], i have 85, src has [1,86]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:26.533639+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 86 ms_handle_reset con 0x5595d9b9c000 session 0x5595d8074b40
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70402048 unmapped: 16465920 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 86 heartbeat osd_stat(store_statfs(0x4fba45000/0x0/0x4ffc00000, data 0x1143b22/0x11d7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:27.533768+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70402048 unmapped: 16465920 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:28.533925+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70434816 unmapped: 16433152 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 86 ms_handle_reset con 0x5595d8137400 session 0x5595d73d8780
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 86 ms_handle_reset con 0x5595d9b9e800 session 0x5595d8196960
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:29.534654+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d7ac9400
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 687015 data_alloc: 218103808 data_used: 192512
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 86 ms_handle_reset con 0x5595d7ac9400 session 0x5595d73d9a40
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70434816 unmapped: 16433152 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 86 handle_osd_map epochs [86,87], i have 86, src has [1,87]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:30.534825+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70369280 unmapped: 16498688 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:31.534987+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d8136c00
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.584367752s of 11.068427086s, submitted: 87
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 87 ms_handle_reset con 0x5595d8136c00 session 0x5595d80634a0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9598000
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 87 heartbeat osd_stat(store_statfs(0x4fba43000/0x0/0x4ffc00000, data 0x1144fde/0x11da000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70418432 unmapped: 16449536 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 87 handle_osd_map epochs [87,88], i have 87, src has [1,88]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 88 ms_handle_reset con 0x5595d9598000 session 0x5595d7bae1e0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:32.535137+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70426624 unmapped: 16441344 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:33.535277+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70426624 unmapped: 16441344 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:34.535488+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 88 heartbeat osd_stat(store_statfs(0x4fba41000/0x0/0x4ffc00000, data 0x11465ba/0x11dc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 690967 data_alloc: 218103808 data_used: 196608
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70426624 unmapped: 16441344 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:35.535630+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70426624 unmapped: 16441344 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:36.535770+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 88 heartbeat osd_stat(store_statfs(0x4fba41000/0x0/0x4ffc00000, data 0x11465ba/0x11dc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 88 handle_osd_map epochs [89,89], i have 88, src has [1,89]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70500352 unmapped: 16367616 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:37.537730+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70500352 unmapped: 16367616 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:38.538731+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70500352 unmapped: 16367616 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:39.538886+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 693939 data_alloc: 218103808 data_used: 196608
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70500352 unmapped: 16367616 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:40.539066+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70500352 unmapped: 16367616 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:41.539214+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70500352 unmapped: 16367616 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:42.539347+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 89 handle_osd_map epochs [89,90], i have 89, src has [1,90]
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.918384552s of 11.027014732s, submitted: 76
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3e000/0x0/0x4ffc00000, data 0x1147a76/0x11df000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70508544 unmapped: 16359424 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:43.539501+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:44.539696+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:45.539866+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:46.540011+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:47.540173+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:48.540363+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:49.540526+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:50.540696+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:51.540830+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:52.540963+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:53.541160+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:54.541360+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:55.541502+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:56.541679+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:57.541836+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:58.541989+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:59.542319+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:00.542477+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:01.542645+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:02.542784+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:03.542920+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:04.543106+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:05.543219+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:06.543413+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:07.543604+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:08.543947+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:09.544098+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:10.544250+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:11.544445+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:12.544582+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:13.544815+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:14.545104+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:15.545265+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:16.545459+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:17.545617+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 16326656 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:18.545943+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:19.546103+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:20.546304+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:21.546477+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:22.546653+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:23.546796+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:24.546975+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:25.547105+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:26.547328+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:27.547600+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:28.547872+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:29.548026+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:30.548431+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:31.548787+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:32.549062+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:33.549277+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:34.549485+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:35.549635+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:36.549924+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:37.550209+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:38.550520+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:39.550831+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 16318464 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:40.551029+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:41.551772+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:42.551946+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:43.552129+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:44.552781+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:45.553125+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:46.553508+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:47.553720+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:48.553972+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:49.554182+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:50.554369+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:51.554527+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:52.554655+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:53.554773+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 16310272 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:54.554906+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: do_command 'config diff' '{prefix=config diff}'
Dec 01 09:48:00 compute-0 ceph-osd[90166]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec 01 09:48:00 compute-0 ceph-osd[90166]: do_command 'config show' '{prefix=config show}'
Dec 01 09:48:00 compute-0 ceph-osd[90166]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 70811648 unmapped: 16056320 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: do_command 'counter dump' '{prefix=counter dump}'
Dec 01 09:48:00 compute-0 ceph-osd[90166]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:55.555040+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: do_command 'counter schema' '{prefix=counter schema}'
Dec 01 09:48:00 compute-0 ceph-osd[90166]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 15605760 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:56.555203+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 15720448 heap: 86867968 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:57.555361+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: do_command 'log dump' '{prefix=log dump}'
Dec 01 09:48:00 compute-0 ceph-osd[90166]: do_command 'log dump' '{prefix=log dump}' result is 0 bytes
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 26763264 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: do_command 'perf dump' '{prefix=perf dump}'
Dec 01 09:48:00 compute-0 ceph-osd[90166]: do_command 'perf dump' '{prefix=perf dump}' result is 0 bytes
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:58.555508+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: do_command 'perf histogram dump' '{prefix=perf histogram dump}'
Dec 01 09:48:00 compute-0 ceph-osd[90166]: do_command 'perf histogram dump' '{prefix=perf histogram dump}' result is 0 bytes
Dec 01 09:48:00 compute-0 ceph-osd[90166]: do_command 'perf schema' '{prefix=perf schema}'
Dec 01 09:48:00 compute-0 ceph-osd[90166]: do_command 'perf schema' '{prefix=perf schema}' result is 0 bytes
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:59.555646+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:00.555814+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:01.556004+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:02.556250+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:03.556540+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:04.556723+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:05.556858+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:06.557099+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:07.557300+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:08.557567+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:09.557828+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:10.557999+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:11.558141+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:12.558378+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:13.558604+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:14.558969+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:15.580216+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:16.580389+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:17.580551+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:18.580744+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:19.580927+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:20.581107+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:21.581304+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:22.581499+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:23.581690+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:24.581898+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:25.582033+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:26.582150+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:27.582300+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:28.582456+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:29.582636+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:30.582778+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:31.582931+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:32.583088+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:33.583222+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:34.583363+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:35.583541+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:36.583731+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:37.583956+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:38.584157+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:39.584372+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:40.584688+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:41.584870+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:42.585070+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:43.585422+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:44.585583+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:45.585760+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:46.585928+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:47.586127+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:48.586436+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:49.586610+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:50.586787+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:51.586919+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:52.587073+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:53.587255+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:54.587374+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:55.587531+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:56.587750+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:57.587912+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:58.588128+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:59.588348+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:00.588562+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:01.588687+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:02.588838+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:03.589015+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:04.589167+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:05.589398+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:06.589554+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:07.589720+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:08.590055+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:09.590874+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:10.591040+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:11.591243+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 26624000 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:12.591391+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:13.591531+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:14.591696+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:15.591893+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:16.592071+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:17.592204+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:18.592397+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:19.592509+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:20.592622+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:21.592798+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:22.592937+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:23.593106+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:24.593243+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:25.593366+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:26.593567+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:27.593745+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:28.593920+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:29.594048+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:30.594221+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:31.594408+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:32.594553+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:33.594723+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:34.594873+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:35.595012+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:36.595133+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:37.595514+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:38.595751+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:39.595942+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:40.596097+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:41.596258+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:42.596384+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:43.596576+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:44.596731+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:45.596917+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:46.597102+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:47.597281+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:48.597567+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:49.597737+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 26615808 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:50.597920+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:51.598042+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:52.598254+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:53.598413+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:54.598596+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:55.598777+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:56.598932+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:57.599102+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:58.742271+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:59.742497+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/361956077' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:00.742685+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:01.742885+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:02.743117+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:03.743335+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:04.743498+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:05.743662+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:06.743868+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:07.744015+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:08.744269+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:09.744514+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:10.744698+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:11.744863+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:12.745009+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:13.745199+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:14.745377+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:15.745630+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:16.745832+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:17.745987+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:18.746219+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:19.746386+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:20.746502+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:21.746654+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:22.746831+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:23.747052+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:24.747244+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:25.747469+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:26.747684+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:27.747913+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:28.748136+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:29.748538+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 26607616 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:30.748751+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:31.749038+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:32.749231+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:33.749379+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:34.749532+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:35.749731+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:36.749877+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:37.750008+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:38.750191+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:39.750366+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:40.750513+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:41.750961+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:42.751169+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:43.751515+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:44.751744+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:45.752001+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:46.752237+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:47.752549+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:48.752832+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:49.753034+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:50.753214+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:51.753417+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:52.753736+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:53.753942+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:54.754121+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:55.754322+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:56.754602+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:57.754919+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:58.755196+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:59.755413+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:00.755702+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:01.755968+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:02.800346+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:03.800525+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:04.800986+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:05.801146+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:06.801368+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:07.801549+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:08.801829+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:09.802120+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:10.802284+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:11.802486+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:12.802617+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:13.802794+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:14.802941+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:15.803103+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:16.803355+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:17.803543+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:18.803770+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:19.803903+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:20.804045+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:21.804218+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:22.804380+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:23.804557+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:24.804720+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:25.804882+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:26.805016+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:27.805234+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:28.805356+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:29.805517+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:30.805695+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:31.805854+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:32.805979+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:33.806184+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:34.806388+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:35.806614+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:36.806788+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:37.806967+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:38.807188+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:39.807430+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:40.807687+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:41.807832+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:42.808022+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:43.808222+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:44.808404+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:45.808569+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:46.808793+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:47.808947+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:48.809142+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:49.809366+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:50.809560+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:51.809749+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:52.810009+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:53.810176+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:54.810390+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:55.810594+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:56.810817+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:57.811054+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:58.811460+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:59.811666+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:00.811827+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:01.812018+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:02.812241+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:03.812462+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:04.812659+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:05.812862+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:06.813001+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:07.813134+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:08.813369+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:09.813547+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:10.813715+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:11.813879+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:12.814098+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:13.815535+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:14.815740+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:15.815939+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:16.816113+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:17.816441+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:18.816739+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:19.816947+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:20.817122+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:21.817341+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:22.817513+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:23.817739+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:24.817904+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:25.818079+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:26.818230+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:27.818383+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:28.818604+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:29.818811+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:30.819001+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:31.819184+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:32.819516+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:33.819672+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:34.819837+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:35.820064+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:36.820282+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:37.820479+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:38.820658+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:39.820791+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:40.820966+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:41.821580+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:42.821777+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:43.821969+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:44.822140+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:45.822342+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:46.822538+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:47.822704+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:48.823006+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:49.823176+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:50.823385+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:51.823583+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:52.823727+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:53.823859+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.4 total, 600.0 interval
                                           Cumulative writes: 5984 writes, 24K keys, 5984 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 5984 writes, 1172 syncs, 5.11 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1822 writes, 4802 keys, 1822 commit groups, 1.0 writes per commit group, ingest: 2.46 MB, 0.00 MB/s
                                           Interval WAL: 1822 writes, 820 syncs, 2.22 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:54.824037+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:55.824208+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:56.824328+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:57.824530+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:58.824796+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:59.824978+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:00.825132+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:01.825473+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:02.825636+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:03.825808+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:04.825979+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:05.826168+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:06.826460+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:07.826877+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:08.827198+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:09.827393+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:10.827606+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:11.827820+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:12.827985+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:13.828710+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:14.828894+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:15.829099+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:16.829372+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:17.829539+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:18.829783+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:19.829975+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:20.830139+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:21.830342+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 26599424 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:22.830492+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:23.830707+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:24.830913+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:25.831116+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:26.831379+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:27.832105+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:28.832441+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:29.832618+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:30.832788+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:31.832960+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:32.833517+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:33.833693+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:34.833831+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:35.833968+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:36.834206+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:37.834379+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:38.834707+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:39.834869+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:40.834984+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:41.835113+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:42.835579+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:43.835823+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:44.836117+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:45.836263+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:46.836416+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:47.836552+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:48.837261+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:49.837426+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:50.837588+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:51.837757+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:52.837910+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:53.838082+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:54.838333+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:55.838553+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:56.838765+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:57.839039+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:58.839316+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 26591232 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:59.839615+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:00.839799+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 ms_handle_reset con 0x5595d9203c00 session 0x5595d6ef6000
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: handle_auth_request added challenge on 0x5595d9598000
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:01.840023+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:02.840332+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:03.840541+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:04.840706+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:05.840926+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:06.841154+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:07.841404+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:08.841676+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:09.841827+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:10.842009+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:11.842187+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:12.842402+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:13.842594+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:14.842713+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:15.842925+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:16.843105+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:17.843415+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:18.843747+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:19.843987+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:20.844230+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:21.844387+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:22.844628+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:23.844850+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:24.845127+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:25.845384+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:26.845627+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:27.845895+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:28.846208+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:29.846458+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:30.846623+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:31.846847+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:32.847100+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:33.847256+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:34.847415+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:35.847674+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:36.848021+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:37.848195+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:38.848560+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:39.848772+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:40.848983+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:41.849217+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:42.849444+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:43.849667+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:44.849856+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:45.850057+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:46.850478+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:47.850666+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:48.850981+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:49.851164+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:50.851402+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:51.851608+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:52.851770+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:53.851911+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:54.852047+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:55.852246+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:56.852404+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:57.852601+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:58.852817+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:59.853035+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:00.853809+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:01.855029+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:02.855539+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:03.856088+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:04.856528+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:05.857196+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 26583040 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:06.857520+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:07.857771+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:08.858356+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:09.858690+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:10.859249+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:11.859697+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:12.859921+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:13.860329+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:14.860549+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:15.861077+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:16.861236+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:17.861453+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:18.861807+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:19.862046+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:20.862409+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:21.862700+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:22.862930+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:23.863224+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:24.863428+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:25.863577+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:26.863781+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:27.863977+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:28.864197+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:29.864348+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:30.864506+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:31.864722+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:32.864912+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:33.865191+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:34.865408+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:35.865550+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:36.865714+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:37.865962+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:38.866341+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:39.866602+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:40.866961+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:41.867138+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:42.867396+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:43.867556+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:44.867779+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:45.868025+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:46.868212+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:47.868398+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:48.868747+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:49.868903+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:50.869100+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:51.869377+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:52.869568+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:53.869770+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:54.869953+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:55.870114+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:56.870276+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:57.870539+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:58.870845+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:59.871025+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:00.871560+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:01.871724+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:02.871969+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:03.872142+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:04.872362+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:05.873687+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:06.873842+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:07.874923+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:08.879182+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:09.880154+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 26574848 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:10.880552+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 26566656 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:11.880900+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 26566656 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:12.881330+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 26566656 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:13.881880+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 26566656 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:14.882038+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 26566656 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:15.882160+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 26566656 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:16.882376+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 26566656 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:17.882512+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 26566656 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:18.882699+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 26566656 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:19.882977+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 26566656 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:20.883339+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 26566656 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:21.883542+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 26566656 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:22.883807+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 26566656 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:23.883970+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 26566656 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:24.884119+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 26566656 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:25.884270+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 26566656 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:26.884433+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:00 compute-0 ceph-osd[90166]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:00 compute-0 ceph-osd[90166]: bluestore.MempoolThread(0x5595d5a6bb60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698671 data_alloc: 218103808 data_used: 241664
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 26566656 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:27.884748+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: do_command 'config diff' '{prefix=config diff}'
Dec 01 09:48:00 compute-0 ceph-osd[90166]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec 01 09:48:00 compute-0 ceph-osd[90166]: do_command 'config show' '{prefix=config show}'
Dec 01 09:48:00 compute-0 ceph-osd[90166]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec 01 09:48:00 compute-0 ceph-osd[90166]: do_command 'counter dump' '{prefix=counter dump}'
Dec 01 09:48:00 compute-0 ceph-osd[90166]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec 01 09:48:00 compute-0 ceph-osd[90166]: do_command 'counter schema' '{prefix=counter schema}'
Dec 01 09:48:00 compute-0 ceph-osd[90166]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 26353664 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:28.884957+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 26198016 heap: 97910784 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: tick
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_tickets
Dec 01 09:48:00 compute-0 ceph-osd[90166]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:29.885082+0000)
Dec 01 09:48:00 compute-0 ceph-osd[90166]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fba3b000/0x0/0x4ffc00000, data 0x1148f16/0x11e2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Dec 01 09:48:00 compute-0 ceph-osd[90166]: do_command 'log dump' '{prefix=log dump}'
Dec 01 09:48:00 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 01 09:48:01 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15082 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:48:01 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Dec 01 09:48:01 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2187297463' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 01 09:48:01 compute-0 ceph-mon[75031]: from='client.15075 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:48:01 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3252037160' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 01 09:48:01 compute-0 ceph-mon[75031]: from='client.15078 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:48:01 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/361956077' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 01 09:48:01 compute-0 ceph-mon[75031]: from='client.15082 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:48:01 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2187297463' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 01 09:48:01 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15086 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:48:01 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Dec 01 09:48:01 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3612791590' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 01 09:48:01 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15090 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:48:01 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1125: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:48:02 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon stat"} v 0) v1
Dec 01 09:48:02 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1125510605' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Dec 01 09:48:02 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15094 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:48:02 compute-0 ceph-mon[75031]: from='client.15086 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:48:02 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3612791590' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 01 09:48:02 compute-0 ceph-mon[75031]: from='client.15090 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:48:02 compute-0 ceph-mon[75031]: pgmap v1125: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:48:02 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1125510605' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Dec 01 09:48:02 compute-0 crontab[278425]: (root) LIST (root)
Dec 01 09:48:02 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15100 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:48:02 compute-0 ceph-5620a9fb-e540-5250-a0e8-7aaad5347e3b-mgr-compute-0-psduho[75320]: 2025-12-01T09:48:02.938+0000 7fd2d6503640 -1 mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Dec 01 09:48:02 compute-0 ceph-mgr[75324]: mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Dec 01 09:48:03 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "node ls"} v 0) v1
Dec 01 09:48:03 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1879242237' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Dec 01 09:48:03 compute-0 ceph-mon[75031]: from='client.15094 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:48:03 compute-0 ceph-mon[75031]: from='client.15100 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:48:03 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1879242237' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Dec 01 09:48:03 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0) v1
Dec 01 09:48:03 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1499303170' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Dec 01 09:48:03 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush class ls"} v 0) v1
Dec 01 09:48:03 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2037774809' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Dec 01 09:48:03 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush dump"} v 0) v1
Dec 01 09:48:03 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4141685720' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Dec 01 09:48:03 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0) v1
Dec 01 09:48:03 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/413353018' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Dec 01 09:48:03 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1126: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:48:04 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush rule ls"} v 0) v1
Dec 01 09:48:04 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1409976490' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Dec 01 09:48:04 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1499303170' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Dec 01 09:48:04 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2037774809' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Dec 01 09:48:04 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/4141685720' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Dec 01 09:48:04 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/413353018' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Dec 01 09:48:04 compute-0 ceph-mon[75031]: pgmap v1126: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:48:04 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1409976490' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Dec 01 09:48:04 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0) v1
Dec 01 09:48:04 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3574915099' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Dec 01 09:48:04 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0) v1
Dec 01 09:48:04 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1136288052' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Dec 01 09:48:04 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0) v1
Dec 01 09:48:04 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1232343839' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Dec 01 09:48:04 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0) v1
Dec 01 09:48:04 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4113344786' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Dec 01 09:48:05 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0) v1
Dec 01 09:48:05 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2188931726' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Dec 01 09:48:05 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3574915099' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Dec 01 09:48:05 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1136288052' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Dec 01 09:48:05 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1232343839' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Dec 01 09:48:05 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/4113344786' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Dec 01 09:48:05 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2188931726' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.19 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58376192 unmapped: 1335296 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 53) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:37.135186+0000 osd.1 (osd.1) 52 : cluster [DBG] 5.18 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:37.149358+0000 osd.1 (osd.1) 53 : cluster [DBG] 5.18 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:08.811735+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 55 sent 53 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:38.109588+0000 osd.1 (osd.1) 54 : cluster [DBG] 5.19 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:38.123475+0000 osd.1 (osd.1) 55 : cluster [DBG] 5.19 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58384384 unmapped: 1327104 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 55) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:38.109588+0000 osd.1 (osd.1) 54 : cluster [DBG] 5.19 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:38.123475+0000 osd.1 (osd.1) 55 : cluster [DBG] 5.19 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:09.811997+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58392576 unmapped: 1318912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:10.812103+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 383751 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58392576 unmapped: 1318912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:11.812233+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1a scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1a scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58392576 unmapped: 1318912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:12.812353+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 57 sent 55 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:42.069793+0000 osd.1 (osd.1) 56 : cluster [DBG] 5.1a scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:42.083790+0000 osd.1 (osd.1) 57 : cluster [DBG] 5.1a scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58400768 unmapped: 1310720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 57) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:42.069793+0000 osd.1 (osd.1) 56 : cluster [DBG] 5.1a scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:42.083790+0000 osd.1 (osd.1) 57 : cluster [DBG] 5.1a scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:13.812588+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1d scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1d scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58417152 unmapped: 1294336 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:14.812723+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 59 sent 57 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:44.098536+0000 osd.1 (osd.1) 58 : cluster [DBG] 5.1d scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:44.112654+0000 osd.1 (osd.1) 59 : cluster [DBG] 5.1d scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58425344 unmapped: 1286144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 59) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:44.098536+0000 osd.1 (osd.1) 58 : cluster [DBG] 5.1d scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:44.112654+0000 osd.1 (osd.1) 59 : cluster [DBG] 5.1d scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:15.812922+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 386047 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58425344 unmapped: 1286144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:16.813058+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58425344 unmapped: 1286144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:17.813199+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58441728 unmapped: 1269760 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:18.813336+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58441728 unmapped: 1269760 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:19.813580+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58449920 unmapped: 1261568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:20.813768+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 386047 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58449920 unmapped: 1261568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:21.813927+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.f deep-scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.945797920s of 14.976054192s, submitted: 8
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.f deep-scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58449920 unmapped: 1261568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:22.814097+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 61 sent 59 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:52.111194+0000 osd.1 (osd.1) 60 : cluster [DBG] 5.f deep-scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:52.125331+0000 osd.1 (osd.1) 61 : cluster [DBG] 5.f deep-scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58458112 unmapped: 1253376 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 61) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:52.111194+0000 osd.1 (osd.1) 60 : cluster [DBG] 5.f deep-scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:52.125331+0000 osd.1 (osd.1) 61 : cluster [DBG] 5.f deep-scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:23.814574+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.9 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.9 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58466304 unmapped: 1245184 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:24.814703+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 63 sent 61 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:54.106839+0000 osd.1 (osd.1) 62 : cluster [DBG] 2.9 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:54.120955+0000 osd.1 (osd.1) 63 : cluster [DBG] 2.9 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58474496 unmapped: 1236992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 63) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:54.106839+0000 osd.1 (osd.1) 62 : cluster [DBG] 2.9 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:54.120955+0000 osd.1 (osd.1) 63 : cluster [DBG] 2.9 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:25.814966+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 388341 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.d scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.d scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58474496 unmapped: 1236992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:26.815118+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 65 sent 63 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:56.110650+0000 osd.1 (osd.1) 64 : cluster [DBG] 4.d scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:56.124687+0000 osd.1 (osd.1) 65 : cluster [DBG] 4.d scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.c scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.c scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58482688 unmapped: 1228800 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 65) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:56.110650+0000 osd.1 (osd.1) 64 : cluster [DBG] 4.d scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:56.124687+0000 osd.1 (osd.1) 65 : cluster [DBG] 4.d scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:27.815382+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 67 sent 65 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:57.076603+0000 osd.1 (osd.1) 66 : cluster [DBG] 6.c scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:57.090754+0000 osd.1 (osd.1) 67 : cluster [DBG] 6.c scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58490880 unmapped: 1220608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 67) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:57.076603+0000 osd.1 (osd.1) 66 : cluster [DBG] 6.c scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:57.090754+0000 osd.1 (osd.1) 67 : cluster [DBG] 6.c scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:28.815575+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.6 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.6 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58490880 unmapped: 1220608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:29.815772+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 69 sent 67 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:59.068688+0000 osd.1 (osd.1) 68 : cluster [DBG] 2.6 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:16:59.082801+0000 osd.1 (osd.1) 69 : cluster [DBG] 2.6 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58499072 unmapped: 1212416 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 69) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:59.068688+0000 osd.1 (osd.1) 68 : cluster [DBG] 2.6 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:16:59.082801+0000 osd.1 (osd.1) 69 : cluster [DBG] 2.6 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:30.815924+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 391782 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.c deep-scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.c deep-scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58507264 unmapped: 1204224 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:31.816053+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 71 sent 69 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:01.076602+0000 osd.1 (osd.1) 70 : cluster [DBG] 5.c deep-scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:01.090715+0000 osd.1 (osd.1) 71 : cluster [DBG] 5.c deep-scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.d scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.d scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58507264 unmapped: 1204224 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 71) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:01.076602+0000 osd.1 (osd.1) 70 : cluster [DBG] 5.c deep-scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:01.090715+0000 osd.1 (osd.1) 71 : cluster [DBG] 5.c deep-scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:32.816345+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 73 sent 71 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:02.080714+0000 osd.1 (osd.1) 72 : cluster [DBG] 6.d scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:02.098382+0000 osd.1 (osd.1) 73 : cluster [DBG] 6.d scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1 deep-scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.828042984s of 10.924718857s, submitted: 14
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.1 deep-scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58523648 unmapped: 1187840 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 73) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:02.080714+0000 osd.1 (osd.1) 72 : cluster [DBG] 6.d scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:02.098382+0000 osd.1 (osd.1) 73 : cluster [DBG] 6.d scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:33.816601+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 75 sent 73 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:03.035844+0000 osd.1 (osd.1) 74 : cluster [DBG] 5.1 deep-scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:03.049961+0000 osd.1 (osd.1) 75 : cluster [DBG] 5.1 deep-scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.f scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.f scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58531840 unmapped: 1179648 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 75) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:03.035844+0000 osd.1 (osd.1) 74 : cluster [DBG] 5.1 deep-scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:03.049961+0000 osd.1 (osd.1) 75 : cluster [DBG] 5.1 deep-scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:34.816776+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 77 sent 75 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:04.067468+0000 osd.1 (osd.1) 76 : cluster [DBG] 4.f scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:04.081451+0000 osd.1 (osd.1) 77 : cluster [DBG] 4.f scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.5 deep-scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.5 deep-scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58540032 unmapped: 1171456 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:35.816966+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 4 last_log 79 sent 77 num 4 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:05.018694+0000 osd.1 (osd.1) 78 : cluster [DBG] 2.5 deep-scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:05.032744+0000 osd.1 (osd.1) 79 : cluster [DBG] 2.5 deep-scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 77) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:04.067468+0000 osd.1 (osd.1) 76 : cluster [DBG] 4.f scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:04.081451+0000 osd.1 (osd.1) 77 : cluster [DBG] 4.f scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.7 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.7 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 398664 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58548224 unmapped: 1163264 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:36.817198+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 4 last_log 81 sent 79 num 4 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:06.053944+0000 osd.1 (osd.1) 80 : cluster [DBG] 2.7 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:06.068056+0000 osd.1 (osd.1) 81 : cluster [DBG] 2.7 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 79) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:05.018694+0000 osd.1 (osd.1) 78 : cluster [DBG] 2.5 deep-scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:05.032744+0000 osd.1 (osd.1) 79 : cluster [DBG] 2.5 deep-scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 81) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:06.053944+0000 osd.1 (osd.1) 80 : cluster [DBG] 2.7 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:06.068056+0000 osd.1 (osd.1) 81 : cluster [DBG] 2.7 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58556416 unmapped: 1155072 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:37.817406+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58572800 unmapped: 1138688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:38.817549+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58572800 unmapped: 1138688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:39.817711+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58580992 unmapped: 1130496 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:40.817843+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 398664 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58580992 unmapped: 1130496 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:41.817981+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58589184 unmapped: 1122304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:42.818108+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58597376 unmapped: 1114112 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:43.818234+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.3 deep-scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.857633591s of 11.003772736s, submitted: 8
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.3 deep-scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58605568 unmapped: 1105920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:44.818367+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 83 sent 81 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:14.039840+0000 osd.1 (osd.1) 82 : cluster [DBG] 2.3 deep-scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:14.053886+0000 osd.1 (osd.1) 83 : cluster [DBG] 2.3 deep-scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 83) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:14.039840+0000 osd.1 (osd.1) 82 : cluster [DBG] 2.3 deep-scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:14.053886+0000 osd.1 (osd.1) 83 : cluster [DBG] 2.3 deep-scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58605568 unmapped: 1105920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:45.818547+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 399811 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58605568 unmapped: 1105920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:46.818676+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58613760 unmapped: 1097728 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:47.818807+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.4 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.4 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58621952 unmapped: 1089536 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:48.818932+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 85 sent 83 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:18.086898+0000 osd.1 (osd.1) 84 : cluster [DBG] 2.4 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:18.101033+0000 osd.1 (osd.1) 85 : cluster [DBG] 2.4 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 85) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:18.086898+0000 osd.1 (osd.1) 84 : cluster [DBG] 2.4 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:18.101033+0000 osd.1 (osd.1) 85 : cluster [DBG] 2.4 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58621952 unmapped: 1089536 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:49.819162+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58630144 unmapped: 1081344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:50.819316+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 400958 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58630144 unmapped: 1081344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:51.819466+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58638336 unmapped: 1073152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:52.819747+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.9 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.9 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58646528 unmapped: 1064960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:53.819890+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 87 sent 85 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:23.171732+0000 osd.1 (osd.1) 86 : cluster [DBG] 5.9 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:23.185809+0000 osd.1 (osd.1) 87 : cluster [DBG] 5.9 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 87) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:23.171732+0000 osd.1 (osd.1) 86 : cluster [DBG] 5.9 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:23.185809+0000 osd.1 (osd.1) 87 : cluster [DBG] 5.9 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58654720 unmapped: 1056768 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:54.820058+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58654720 unmapped: 1056768 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:55.820440+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 402105 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.16 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.033282280s of 12.157509804s, submitted: 6
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.16 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58671104 unmapped: 1040384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:56.820634+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 89 sent 87 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:26.197393+0000 osd.1 (osd.1) 88 : cluster [DBG] 5.16 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:26.211663+0000 osd.1 (osd.1) 89 : cluster [DBG] 5.16 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 89) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:26.197393+0000 osd.1 (osd.1) 88 : cluster [DBG] 5.16 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:26.211663+0000 osd.1 (osd.1) 89 : cluster [DBG] 5.16 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58671104 unmapped: 1040384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:57.820940+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58687488 unmapped: 1024000 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:58.821084+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58695680 unmapped: 1015808 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:59.821321+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.15 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.15 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58695680 unmapped: 1015808 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:00.821445+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 91 sent 89 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:30.174708+0000 osd.1 (osd.1) 90 : cluster [DBG] 2.15 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:30.188775+0000 osd.1 (osd.1) 91 : cluster [DBG] 2.15 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 404401 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 91) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:30.174708+0000 osd.1 (osd.1) 90 : cluster [DBG] 2.15 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:30.188775+0000 osd.1 (osd.1) 91 : cluster [DBG] 2.15 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58703872 unmapped: 1007616 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:01.821661+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.12 deep-scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.12 deep-scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58720256 unmapped: 991232 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:02.821775+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 93 sent 91 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:32.221876+0000 osd.1 (osd.1) 92 : cluster [DBG] 5.12 deep-scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:32.235942+0000 osd.1 (osd.1) 93 : cluster [DBG] 5.12 deep-scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 93) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:32.221876+0000 osd.1 (osd.1) 92 : cluster [DBG] 5.12 deep-scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:32.235942+0000 osd.1 (osd.1) 93 : cluster [DBG] 5.12 deep-scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58720256 unmapped: 991232 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:03.821923+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58728448 unmapped: 983040 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:04.822244+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58728448 unmapped: 983040 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:05.822485+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 405549 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58736640 unmapped: 974848 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:06.822734+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58736640 unmapped: 974848 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:07.823238+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.13 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.857577324s of 11.936425209s, submitted: 6
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.13 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58761216 unmapped: 950272 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:08.823446+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 95 sent 93 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:38.133696+0000 osd.1 (osd.1) 94 : cluster [DBG] 5.13 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:38.147872+0000 osd.1 (osd.1) 95 : cluster [DBG] 5.13 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 95) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:38.133696+0000 osd.1 (osd.1) 94 : cluster [DBG] 5.13 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:38.147872+0000 osd.1 (osd.1) 95 : cluster [DBG] 5.13 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58761216 unmapped: 950272 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:09.823668+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.17 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.17 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58785792 unmapped: 925696 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:10.823848+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 97 sent 95 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:40.178448+0000 osd.1 (osd.1) 96 : cluster [DBG] 2.17 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:40.192812+0000 osd.1 (osd.1) 97 : cluster [DBG] 2.17 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 97) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:40.178448+0000 osd.1 (osd.1) 96 : cluster [DBG] 2.17 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:40.192812+0000 osd.1 (osd.1) 97 : cluster [DBG] 2.17 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 407845 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58793984 unmapped: 917504 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:11.824081+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58793984 unmapped: 917504 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:12.824249+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58793984 unmapped: 917504 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:13.824410+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58802176 unmapped: 909312 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:14.824583+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58810368 unmapped: 901120 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:15.824732+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.a scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.a scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 408992 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58818560 unmapped: 892928 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:16.824909+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 99 sent 97 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:46.183927+0000 osd.1 (osd.1) 98 : cluster [DBG] 2.a scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:46.198038+0000 osd.1 (osd.1) 99 : cluster [DBG] 2.a scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 99) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:46.183927+0000 osd.1 (osd.1) 98 : cluster [DBG] 2.a scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:46.198038+0000 osd.1 (osd.1) 99 : cluster [DBG] 2.a scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58818560 unmapped: 892928 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:17.825128+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.d scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.989212036s of 10.056298256s, submitted: 6
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.d scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58834944 unmapped: 876544 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:18.825277+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 101 sent 99 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:48.190144+0000 osd.1 (osd.1) 100 : cluster [DBG] 2.d scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:48.204241+0000 osd.1 (osd.1) 101 : cluster [DBG] 2.d scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 101) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:48.190144+0000 osd.1 (osd.1) 100 : cluster [DBG] 2.d scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:48.204241+0000 osd.1 (osd.1) 101 : cluster [DBG] 2.d scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58843136 unmapped: 868352 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:19.825736+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58843136 unmapped: 868352 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:20.825898+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.6 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.6 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411286 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58851328 unmapped: 860160 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:21.826066+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 103 sent 101 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:51.191890+0000 osd.1 (osd.1) 102 : cluster [DBG] 6.6 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:51.209490+0000 osd.1 (osd.1) 103 : cluster [DBG] 6.6 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.4 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.4 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58851328 unmapped: 860160 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 103) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:51.191890+0000 osd.1 (osd.1) 102 : cluster [DBG] 6.6 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:51.209490+0000 osd.1 (osd.1) 103 : cluster [DBG] 6.6 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:22.826342+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 105 sent 103 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:52.230020+0000 osd.1 (osd.1) 104 : cluster [DBG] 4.4 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:52.244120+0000 osd.1 (osd.1) 105 : cluster [DBG] 4.4 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58859520 unmapped: 851968 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 105) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:52.230020+0000 osd.1 (osd.1) 104 : cluster [DBG] 4.4 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:52.244120+0000 osd.1 (osd.1) 105 : cluster [DBG] 4.4 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:23.826560+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 107 sent 105 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:53.232235+0000 osd.1 (osd.1) 106 : cluster [DBG] 6.1 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:53.246302+0000 osd.1 (osd.1) 107 : cluster [DBG] 6.1 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58859520 unmapped: 851968 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 107) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:53.232235+0000 osd.1 (osd.1) 106 : cluster [DBG] 6.1 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:53.246302+0000 osd.1 (osd.1) 107 : cluster [DBG] 6.1 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:24.826756+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58859520 unmapped: 851968 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:25.826919+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.7 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.7 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 414727 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58867712 unmapped: 843776 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:26.827096+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 109 sent 107 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:56.196278+0000 osd.1 (osd.1) 108 : cluster [DBG] 4.7 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:56.210230+0000 osd.1 (osd.1) 109 : cluster [DBG] 4.7 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58867712 unmapped: 843776 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 109) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:56.196278+0000 osd.1 (osd.1) 108 : cluster [DBG] 4.7 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:56.210230+0000 osd.1 (osd.1) 109 : cluster [DBG] 4.7 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:27.827355+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.b scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.990326881s of 10.023887634s, submitted: 10
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.b scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58875904 unmapped: 835584 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:28.827502+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 111 sent 109 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:58.213887+0000 osd.1 (osd.1) 110 : cluster [DBG] 6.b scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:58.231581+0000 osd.1 (osd.1) 111 : cluster [DBG] 6.b scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.5 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.5 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58892288 unmapped: 819200 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 111) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:58.213887+0000 osd.1 (osd.1) 110 : cluster [DBG] 6.b scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:58.231581+0000 osd.1 (osd.1) 111 : cluster [DBG] 6.b scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:29.828205+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 113 sent 111 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:59.204601+0000 osd.1 (osd.1) 112 : cluster [DBG] 4.5 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:17:59.218668+0000 osd.1 (osd.1) 113 : cluster [DBG] 4.5 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58908672 unmapped: 802816 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 113) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:59.204601+0000 osd.1 (osd.1) 112 : cluster [DBG] 4.5 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:17:59.218668+0000 osd.1 (osd.1) 113 : cluster [DBG] 4.5 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:30.828380+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 417021 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58908672 unmapped: 802816 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:31.830649+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.e scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.e scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58908672 unmapped: 802816 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:32.830796+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 115 sent 113 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:02.187324+0000 osd.1 (osd.1) 114 : cluster [DBG] 6.e scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:02.204891+0000 osd.1 (osd.1) 115 : cluster [DBG] 6.e scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58925056 unmapped: 786432 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 115) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:02.187324+0000 osd.1 (osd.1) 114 : cluster [DBG] 6.e scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:02.204891+0000 osd.1 (osd.1) 115 : cluster [DBG] 6.e scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:33.831135+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58925056 unmapped: 786432 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:34.831341+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.9 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.9 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58933248 unmapped: 778240 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:35.831779+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 117 sent 115 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:05.177850+0000 osd.1 (osd.1) 116 : cluster [DBG] 4.9 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:05.191965+0000 osd.1 (osd.1) 117 : cluster [DBG] 4.9 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.8 deep-scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.8 deep-scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 420462 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58966016 unmapped: 745472 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 117) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:05.177850+0000 osd.1 (osd.1) 116 : cluster [DBG] 4.9 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:05.191965+0000 osd.1 (osd.1) 117 : cluster [DBG] 4.9 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:36.832180+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 119 sent 117 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:06.158642+0000 osd.1 (osd.1) 118 : cluster [DBG] 4.8 deep-scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:06.172862+0000 osd.1 (osd.1) 119 : cluster [DBG] 4.8 deep-scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.17 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.17 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58974208 unmapped: 737280 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 119) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:06.158642+0000 osd.1 (osd.1) 118 : cluster [DBG] 4.8 deep-scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:06.172862+0000 osd.1 (osd.1) 119 : cluster [DBG] 4.8 deep-scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:37.832459+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 121 sent 119 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:07.208220+0000 osd.1 (osd.1) 120 : cluster [DBG] 6.17 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:07.222392+0000 osd.1 (osd.1) 121 : cluster [DBG] 6.17 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.14 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.916434288s of 10.038483620s, submitted: 12
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.14 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58982400 unmapped: 729088 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 121) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:07.208220+0000 osd.1 (osd.1) 120 : cluster [DBG] 6.17 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:07.222392+0000 osd.1 (osd.1) 121 : cluster [DBG] 6.17 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:38.833133+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 123 sent 121 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:08.252604+0000 osd.1 (osd.1) 122 : cluster [DBG] 4.14 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:08.266700+0000 osd.1 (osd.1) 123 : cluster [DBG] 4.14 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58982400 unmapped: 729088 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 123) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:08.252604+0000 osd.1 (osd.1) 122 : cluster [DBG] 4.14 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:08.266700+0000 osd.1 (osd.1) 123 : cluster [DBG] 4.14 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:39.833360+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58990592 unmapped: 720896 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:40.833810+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.12 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.12 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 423906 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 58998784 unmapped: 712704 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:41.834070+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 125 sent 123 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:11.240823+0000 osd.1 (osd.1) 124 : cluster [DBG] 4.12 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:11.254856+0000 osd.1 (osd.1) 125 : cluster [DBG] 4.12 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 125) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:11.240823+0000 osd.1 (osd.1) 124 : cluster [DBG] 4.12 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:11.254856+0000 osd.1 (osd.1) 125 : cluster [DBG] 4.12 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.10 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 4.10 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59039744 unmapped: 671744 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:42.834259+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 127 sent 125 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:12.226843+0000 osd.1 (osd.1) 126 : cluster [DBG] 4.10 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:12.241115+0000 osd.1 (osd.1) 127 : cluster [DBG] 4.10 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 127) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:12.226843+0000 osd.1 (osd.1) 126 : cluster [DBG] 4.10 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:12.241115+0000 osd.1 (osd.1) 127 : cluster [DBG] 4.10 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.2 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.2 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59056128 unmapped: 655360 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:43.834820+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 129 sent 127 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:13.235284+0000 osd.1 (osd.1) 128 : cluster [DBG] 6.2 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:13.249338+0000 osd.1 (osd.1) 129 : cluster [DBG] 6.2 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 129) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:13.235284+0000 osd.1 (osd.1) 128 : cluster [DBG] 6.2 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:13.249338+0000 osd.1 (osd.1) 129 : cluster [DBG] 6.2 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59056128 unmapped: 655360 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:44.835370+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59064320 unmapped: 647168 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:45.835966+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.11 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 5.11 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 427349 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59072512 unmapped: 638976 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:46.836144+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 131 sent 129 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:16.283606+0000 osd.1 (osd.1) 130 : cluster [DBG] 5.11 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:16.297593+0000 osd.1 (osd.1) 131 : cluster [DBG] 5.11 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 131) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:16.283606+0000 osd.1 (osd.1) 130 : cluster [DBG] 5.11 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:16.297593+0000 osd.1 (osd.1) 131 : cluster [DBG] 5.11 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59088896 unmapped: 622592 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:47.836767+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.1b deep-scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.910740852s of 10.036123276s, submitted: 10
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 2.1b deep-scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59105280 unmapped: 606208 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:48.837174+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 133 sent 131 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:18.288934+0000 osd.1 (osd.1) 132 : cluster [DBG] 2.1b deep-scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:18.302868+0000 osd.1 (osd.1) 133 : cluster [DBG] 2.1b deep-scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 133) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:18.288934+0000 osd.1 (osd.1) 132 : cluster [DBG] 2.1b deep-scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:18.302868+0000 osd.1 (osd.1) 133 : cluster [DBG] 2.1b deep-scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59113472 unmapped: 598016 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:49.837905+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59113472 unmapped: 598016 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:50.838078+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 428497 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59113472 unmapped: 598016 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:51.838415+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59121664 unmapped: 589824 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:52.838741+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59121664 unmapped: 589824 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:53.838886+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1d scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1d scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59121664 unmapped: 589824 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:54.839078+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 135 sent 133 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:24.241540+0000 osd.1 (osd.1) 134 : cluster [DBG] 6.1d scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:24.259223+0000 osd.1 (osd.1) 135 : cluster [DBG] 6.1d scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 135) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:24.241540+0000 osd.1 (osd.1) 134 : cluster [DBG] 6.1d scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:24.259223+0000 osd.1 (osd.1) 135 : cluster [DBG] 6.1d scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59129856 unmapped: 581632 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:55.839351+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1c scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.1c scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 430793 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59129856 unmapped: 581632 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:56.839641+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 137 sent 135 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:26.311663+0000 osd.1 (osd.1) 136 : cluster [DBG] 6.1c scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:26.329368+0000 osd.1 (osd.1) 137 : cluster [DBG] 6.1c scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 137) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:26.311663+0000 osd.1 (osd.1) 136 : cluster [DBG] 6.1c scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:26.329368+0000 osd.1 (osd.1) 137 : cluster [DBG] 6.1c scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.4 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_channel(cluster) log [DBG] : 6.4 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59146240 unmapped: 565248 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:57.839881+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  log_queue is 2 last_log 139 sent 137 num 2 unsent 2 sending 2
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:27.274463+0000 osd.1 (osd.1) 138 : cluster [DBG] 6.4 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  will send 2025-12-01T09:18:27.288558+0000 osd.1 (osd.1) 139 : cluster [DBG] 6.4 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client handle_log_ack log(last 139) v1
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:27.274463+0000 osd.1 (osd.1) 138 : cluster [DBG] 6.4 scrub starts
Dec 01 09:48:05 compute-0 ceph-osd[89052]: log_client  logged 2025-12-01T09:18:27.288558+0000 osd.1 (osd.1) 139 : cluster [DBG] 6.4 scrub ok
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59146240 unmapped: 565248 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:58.840093+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59146240 unmapped: 565248 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:59.840473+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59154432 unmapped: 557056 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:00.840714+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59154432 unmapped: 557056 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:01.840921+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59162624 unmapped: 548864 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:02.841093+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59170816 unmapped: 540672 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:03.841390+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59179008 unmapped: 532480 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:04.841633+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59179008 unmapped: 532480 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:05.841887+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59179008 unmapped: 532480 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:06.842161+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59187200 unmapped: 524288 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:07.842378+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59187200 unmapped: 524288 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:08.842555+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59195392 unmapped: 516096 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:09.842799+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59195392 unmapped: 516096 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:10.842987+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59203584 unmapped: 507904 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:11.843249+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59203584 unmapped: 507904 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:12.843440+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59203584 unmapped: 507904 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:13.843610+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59211776 unmapped: 499712 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:14.843762+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59211776 unmapped: 499712 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:15.843997+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59211776 unmapped: 499712 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:16.844207+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59219968 unmapped: 491520 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:17.844479+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59219968 unmapped: 491520 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:18.844760+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59228160 unmapped: 483328 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:19.845053+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59228160 unmapped: 483328 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:20.845184+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59236352 unmapped: 475136 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:21.845387+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59244544 unmapped: 466944 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:22.845573+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59244544 unmapped: 466944 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:23.845694+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59252736 unmapped: 458752 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:24.845904+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59252736 unmapped: 458752 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:25.846060+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59260928 unmapped: 450560 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:26.846273+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59260928 unmapped: 450560 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:27.846461+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59260928 unmapped: 450560 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:28.846650+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59269120 unmapped: 442368 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:29.846935+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59269120 unmapped: 442368 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:30.847113+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59277312 unmapped: 434176 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:31.847253+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59269120 unmapped: 442368 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:32.847567+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59269120 unmapped: 442368 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:33.847713+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59277312 unmapped: 434176 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:34.847911+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59277312 unmapped: 434176 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:35.848074+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59285504 unmapped: 425984 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:36.848257+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59285504 unmapped: 425984 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:37.848425+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59285504 unmapped: 425984 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:38.848620+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59293696 unmapped: 417792 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:39.848851+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59293696 unmapped: 417792 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:40.849037+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59293696 unmapped: 417792 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:41.849244+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59301888 unmapped: 409600 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:42.849433+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59310080 unmapped: 401408 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:43.849582+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59310080 unmapped: 401408 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:44.849789+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59310080 unmapped: 401408 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:45.849946+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:46.850094+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:47.850341+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:48.850474+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59326464 unmapped: 385024 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:49.850659+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59326464 unmapped: 385024 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:50.850808+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59334656 unmapped: 376832 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:51.850992+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59334656 unmapped: 376832 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:52.851165+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59334656 unmapped: 376832 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:53.851345+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59342848 unmapped: 368640 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:54.851508+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59342848 unmapped: 368640 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:55.851705+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59351040 unmapped: 360448 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:56.851967+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59351040 unmapped: 360448 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:57.852201+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59359232 unmapped: 352256 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:58.852383+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59367424 unmapped: 344064 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:59.852702+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59367424 unmapped: 344064 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:00.852877+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59375616 unmapped: 335872 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:01.853535+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59392000 unmapped: 319488 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:02.853799+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59392000 unmapped: 319488 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:03.853941+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59400192 unmapped: 311296 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:04.854093+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59400192 unmapped: 311296 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:05.854269+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59408384 unmapped: 303104 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:06.854570+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59408384 unmapped: 303104 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:07.854793+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59408384 unmapped: 303104 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:08.855049+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59416576 unmapped: 294912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:09.855382+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59416576 unmapped: 294912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:10.855532+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59424768 unmapped: 286720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:11.855830+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59424768 unmapped: 286720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:12.856101+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59424768 unmapped: 286720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:13.856404+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59432960 unmapped: 278528 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:14.856583+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59432960 unmapped: 278528 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:15.856746+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59441152 unmapped: 270336 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:16.856922+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59441152 unmapped: 270336 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:17.857098+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:18.857278+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59449344 unmapped: 262144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:19.857626+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59449344 unmapped: 262144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:20.857820+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59449344 unmapped: 262144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:21.857974+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59457536 unmapped: 253952 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:22.858216+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59473920 unmapped: 237568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:23.858362+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59482112 unmapped: 229376 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:24.858530+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59482112 unmapped: 229376 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:25.858722+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59490304 unmapped: 221184 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:26.858870+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59490304 unmapped: 221184 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:27.859082+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59490304 unmapped: 221184 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:28.859308+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59498496 unmapped: 212992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:29.859983+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59498496 unmapped: 212992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:30.860139+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59506688 unmapped: 204800 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:31.860397+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59506688 unmapped: 204800 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:32.860589+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59514880 unmapped: 196608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:33.860764+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59514880 unmapped: 196608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:34.860939+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59514880 unmapped: 196608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:35.861198+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59523072 unmapped: 188416 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:36.861368+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59523072 unmapped: 188416 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:37.861480+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59523072 unmapped: 188416 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:38.861608+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59531264 unmapped: 180224 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:39.861841+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59531264 unmapped: 180224 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:40.861993+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59539456 unmapped: 172032 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:41.862124+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59539456 unmapped: 172032 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:42.862314+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59555840 unmapped: 155648 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:43.862484+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59564032 unmapped: 147456 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:44.862613+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59564032 unmapped: 147456 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:45.862752+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59572224 unmapped: 139264 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:46.863025+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59572224 unmapped: 139264 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:47.863186+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59572224 unmapped: 139264 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:48.863422+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59580416 unmapped: 131072 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:49.864330+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59580416 unmapped: 131072 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:50.864568+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59588608 unmapped: 122880 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:51.864771+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59588608 unmapped: 122880 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:52.865028+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59588608 unmapped: 122880 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:53.865160+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59596800 unmapped: 114688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:54.865359+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59596800 unmapped: 114688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:55.865639+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59604992 unmapped: 106496 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:56.865824+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59604992 unmapped: 106496 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:57.866083+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59604992 unmapped: 106496 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:58.866429+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59613184 unmapped: 98304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:59.866656+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59613184 unmapped: 98304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:00.866873+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59613184 unmapped: 98304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:01.867050+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59621376 unmapped: 90112 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:02.867203+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59637760 unmapped: 73728 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:03.867367+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59645952 unmapped: 65536 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:04.867488+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59645952 unmapped: 65536 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:05.867619+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:06.867744+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:07.867850+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:08.867994+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:09.868205+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:10.868358+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:11.868497+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:12.868636+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:13.868759+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59678720 unmapped: 32768 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:14.868874+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59678720 unmapped: 32768 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:15.869016+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59678720 unmapped: 32768 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:16.869151+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59686912 unmapped: 24576 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:17.869319+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59686912 unmapped: 24576 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:18.869426+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 16384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:19.869607+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 16384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:20.869736+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 16384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:21.869872+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 8192 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:22.869999+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 8192 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:23.870138+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 0 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:24.870278+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 0 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:25.870445+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:26.870573+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:27.870723+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:28.870874+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:29.871043+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:30.871179+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:31.871391+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59736064 unmapped: 1024000 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:32.871579+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59736064 unmapped: 1024000 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:33.871909+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59744256 unmapped: 1015808 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:34.872024+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59744256 unmapped: 1015808 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:35.872136+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:36.872278+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:37.872462+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:38.872591+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:39.872831+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:40.872970+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59760640 unmapped: 999424 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:41.873094+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:42.873221+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:43.873445+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:44.873639+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:45.873767+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:46.873986+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:47.874163+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:48.874379+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59793408 unmapped: 966656 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:49.874602+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59793408 unmapped: 966656 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:50.874784+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59793408 unmapped: 966656 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:51.874952+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59801600 unmapped: 958464 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:52.875103+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59801600 unmapped: 958464 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:53.875249+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59809792 unmapped: 950272 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:54.875364+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59809792 unmapped: 950272 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:55.875564+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59809792 unmapped: 950272 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:56.875735+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59817984 unmapped: 942080 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:57.875918+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59817984 unmapped: 942080 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:58.876069+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59817984 unmapped: 942080 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:59.876277+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59826176 unmapped: 933888 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:00.876537+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59826176 unmapped: 933888 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:01.876698+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59834368 unmapped: 925696 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:02.876871+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59834368 unmapped: 925696 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:03.877032+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59834368 unmapped: 925696 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:04.877213+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59842560 unmapped: 917504 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:05.877371+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59842560 unmapped: 917504 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:06.877549+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59842560 unmapped: 917504 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:07.877876+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59850752 unmapped: 909312 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:08.878064+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59850752 unmapped: 909312 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:09.878321+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59858944 unmapped: 901120 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:10.878474+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59858944 unmapped: 901120 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:11.878669+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59858944 unmapped: 901120 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:12.878860+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59867136 unmapped: 892928 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:13.879010+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59867136 unmapped: 892928 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:14.879252+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59875328 unmapped: 884736 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:15.879493+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59875328 unmapped: 884736 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:16.879654+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59875328 unmapped: 884736 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:17.879818+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59883520 unmapped: 876544 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:18.880206+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59883520 unmapped: 876544 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:19.881188+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59883520 unmapped: 876544 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:20.881762+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59891712 unmapped: 868352 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:21.882273+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59891712 unmapped: 868352 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:22.882497+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59899904 unmapped: 860160 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:23.882758+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59899904 unmapped: 860160 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:24.882921+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59899904 unmapped: 860160 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:25.883397+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59908096 unmapped: 851968 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:26.883570+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59908096 unmapped: 851968 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:27.883791+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59916288 unmapped: 843776 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:28.884105+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59916288 unmapped: 843776 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:29.884502+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59924480 unmapped: 835584 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:30.884744+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59924480 unmapped: 835584 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:31.885049+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59924480 unmapped: 835584 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:32.885342+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59932672 unmapped: 827392 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:33.885634+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59932672 unmapped: 827392 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:34.885940+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59932672 unmapped: 827392 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:35.886126+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59940864 unmapped: 819200 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:36.886337+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59940864 unmapped: 819200 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:37.886547+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59949056 unmapped: 811008 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:38.886927+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59949056 unmapped: 811008 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:39.887273+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59957248 unmapped: 802816 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:40.887478+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59957248 unmapped: 802816 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:41.887623+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59957248 unmapped: 802816 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:42.887814+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59965440 unmapped: 794624 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:43.887999+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59965440 unmapped: 794624 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:44.888178+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59965440 unmapped: 794624 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:45.888410+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59973632 unmapped: 786432 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:46.888551+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59973632 unmapped: 786432 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:47.888762+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59973632 unmapped: 786432 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:48.888966+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59981824 unmapped: 778240 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:49.889233+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59981824 unmapped: 778240 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:50.889357+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59990016 unmapped: 770048 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:51.889529+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59990016 unmapped: 770048 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:52.889661+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59990016 unmapped: 770048 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:53.889845+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59998208 unmapped: 761856 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:54.890276+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59998208 unmapped: 761856 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:55.890475+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60006400 unmapped: 753664 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:56.890623+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60006400 unmapped: 753664 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:57.890994+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60006400 unmapped: 753664 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:58.891169+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60014592 unmapped: 745472 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:59.891906+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60014592 unmapped: 745472 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:00.892101+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 59998208 unmapped: 761856 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:01.892543+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60006400 unmapped: 753664 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:02.892745+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60006400 unmapped: 753664 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:03.892958+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60006400 unmapped: 753664 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:04.893389+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60014592 unmapped: 745472 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:05.893576+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60014592 unmapped: 745472 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:06.893819+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60022784 unmapped: 737280 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:07.893989+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60022784 unmapped: 737280 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:08.894141+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60022784 unmapped: 737280 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:09.894385+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60030976 unmapped: 729088 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:10.894606+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60030976 unmapped: 729088 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:11.894755+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60030976 unmapped: 729088 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:12.894949+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60039168 unmapped: 720896 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:13.895143+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60039168 unmapped: 720896 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:14.895327+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60047360 unmapped: 712704 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:15.895475+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60047360 unmapped: 712704 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:16.895615+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60047360 unmapped: 712704 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:17.895737+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60055552 unmapped: 704512 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:18.895927+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60055552 unmapped: 704512 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:19.896375+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60063744 unmapped: 696320 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:20.896559+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60063744 unmapped: 696320 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:21.896727+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60063744 unmapped: 696320 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:22.896896+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60071936 unmapped: 688128 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:23.897043+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60071936 unmapped: 688128 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:24.897237+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60080128 unmapped: 679936 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:25.897382+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60080128 unmapped: 679936 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:26.897543+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60080128 unmapped: 679936 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:27.897680+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60088320 unmapped: 671744 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:28.897798+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60088320 unmapped: 671744 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:29.897993+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60096512 unmapped: 663552 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:30.898158+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60096512 unmapped: 663552 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:31.898316+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60096512 unmapped: 663552 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:32.898449+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60104704 unmapped: 655360 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:33.898603+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60104704 unmapped: 655360 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:34.898769+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60104704 unmapped: 655360 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:35.898896+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60112896 unmapped: 647168 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:36.899045+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60112896 unmapped: 647168 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:37.899264+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60112896 unmapped: 647168 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:38.899492+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60121088 unmapped: 638976 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:39.899900+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60121088 unmapped: 638976 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:40.900043+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60129280 unmapped: 630784 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:41.900191+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60129280 unmapped: 630784 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:42.900335+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60129280 unmapped: 630784 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:43.900493+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60137472 unmapped: 622592 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:44.900640+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60137472 unmapped: 622592 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:45.900789+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60145664 unmapped: 614400 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:46.900990+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60145664 unmapped: 614400 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:47.901140+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60145664 unmapped: 614400 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:48.901285+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60153856 unmapped: 606208 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:49.901551+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60153856 unmapped: 606208 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:50.901755+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60162048 unmapped: 598016 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:51.901892+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60162048 unmapped: 598016 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:52.902050+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60162048 unmapped: 598016 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:53.902199+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60170240 unmapped: 589824 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:54.902413+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60170240 unmapped: 589824 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:55.902571+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60170240 unmapped: 589824 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:56.902675+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60178432 unmapped: 581632 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:58.028139+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60178432 unmapped: 581632 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:59.028953+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60186624 unmapped: 573440 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:00.029243+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60186624 unmapped: 573440 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:01.029394+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60194816 unmapped: 565248 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:02.029568+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60194816 unmapped: 565248 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:03.029728+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60194816 unmapped: 565248 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:04.029889+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60203008 unmapped: 557056 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:05.030062+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60203008 unmapped: 557056 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:06.030234+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60203008 unmapped: 557056 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:07.030433+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60211200 unmapped: 548864 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:08.030582+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60211200 unmapped: 548864 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:09.030751+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60219392 unmapped: 540672 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:10.030899+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60219392 unmapped: 540672 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:11.031021+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60219392 unmapped: 540672 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:12.031150+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60219392 unmapped: 540672 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:13.031307+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60219392 unmapped: 540672 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:14.031446+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60219392 unmapped: 540672 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:15.031570+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60227584 unmapped: 532480 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:16.031747+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60227584 unmapped: 532480 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:17.031927+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60235776 unmapped: 524288 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:18.032118+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60235776 unmapped: 524288 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:19.032337+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60243968 unmapped: 516096 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:20.032517+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60243968 unmapped: 516096 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:21.032665+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60243968 unmapped: 516096 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:22.032802+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60252160 unmapped: 507904 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:23.032942+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60252160 unmapped: 507904 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:24.033171+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60252160 unmapped: 507904 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:25.033402+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60260352 unmapped: 499712 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:26.033666+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60260352 unmapped: 499712 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:27.034433+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60260352 unmapped: 499712 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:28.034576+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60268544 unmapped: 491520 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:29.034741+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60268544 unmapped: 491520 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:30.035128+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60276736 unmapped: 483328 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:31.035367+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60276736 unmapped: 483328 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:32.035515+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60284928 unmapped: 475136 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:33.035761+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60284928 unmapped: 475136 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:34.035929+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60284928 unmapped: 475136 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:35.036365+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60293120 unmapped: 466944 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:36.036491+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60293120 unmapped: 466944 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:37.036611+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60301312 unmapped: 458752 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:38.036728+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60301312 unmapped: 458752 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:39.036892+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60301312 unmapped: 458752 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:40.037102+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60309504 unmapped: 450560 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:41.037430+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60309504 unmapped: 450560 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:42.037678+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60309504 unmapped: 450560 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:43.038021+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60317696 unmapped: 442368 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:44.038163+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60317696 unmapped: 442368 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:45.038347+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Cumulative writes: 4343 writes, 19K keys, 4343 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
                                           Cumulative WAL: 4343 writes, 398 syncs, 10.91 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 4343 writes, 19K keys, 4343 commit groups, 1.0 writes per commit group, ingest: 16.04 MB, 0.03 MB/s
                                           Interval WAL: 4343 writes, 398 syncs, 10.91 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f19149090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f19149090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f19149090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60391424 unmapped: 368640 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:46.038531+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60399616 unmapped: 360448 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:47.038655+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60399616 unmapped: 360448 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:48.038800+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60407808 unmapped: 352256 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:49.038960+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60407808 unmapped: 352256 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:50.039181+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60407808 unmapped: 352256 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:51.039464+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:52.039677+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60416000 unmapped: 344064 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:53.039914+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60416000 unmapped: 344064 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:54.040089+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60424192 unmapped: 335872 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:55.040324+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60424192 unmapped: 335872 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:56.040470+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60424192 unmapped: 335872 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:57.040609+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60432384 unmapped: 327680 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:58.040780+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60440576 unmapped: 319488 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:59.040874+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60440576 unmapped: 319488 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:00.041195+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60448768 unmapped: 311296 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:01.041365+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60448768 unmapped: 311296 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:02.041530+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60456960 unmapped: 303104 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:03.041687+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60456960 unmapped: 303104 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:04.041841+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60465152 unmapped: 294912 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:05.041978+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60465152 unmapped: 294912 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:06.042111+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60465152 unmapped: 294912 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:07.042262+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60473344 unmapped: 286720 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:08.042418+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60473344 unmapped: 286720 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:09.042606+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60481536 unmapped: 278528 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:10.042803+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60481536 unmapped: 278528 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:11.042910+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60481536 unmapped: 278528 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:12.043050+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60489728 unmapped: 270336 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:13.043221+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60489728 unmapped: 270336 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:14.043352+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60497920 unmapped: 262144 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:15.043511+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60497920 unmapped: 262144 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:16.043643+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60497920 unmapped: 262144 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:17.044022+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60506112 unmapped: 253952 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:18.044233+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60506112 unmapped: 253952 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:19.045350+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60506112 unmapped: 253952 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:20.045535+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60506112 unmapped: 253952 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:21.045667+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60514304 unmapped: 245760 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:22.045801+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60514304 unmapped: 245760 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:23.045962+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60522496 unmapped: 237568 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:24.046104+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60522496 unmapped: 237568 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:25.046207+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60522496 unmapped: 237568 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:26.046563+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60530688 unmapped: 229376 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:27.046764+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60530688 unmapped: 229376 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:28.046904+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60538880 unmapped: 221184 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:29.047222+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60538880 unmapped: 221184 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:30.047458+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60538880 unmapped: 221184 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:31.047607+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60547072 unmapped: 212992 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:32.047823+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60547072 unmapped: 212992 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:33.047973+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60555264 unmapped: 204800 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:34.048167+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60555264 unmapped: 204800 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:35.048343+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60555264 unmapped: 204800 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:36.048494+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60563456 unmapped: 196608 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:37.048617+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60563456 unmapped: 196608 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:38.048764+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60571648 unmapped: 188416 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:39.048917+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60571648 unmapped: 188416 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:40.049090+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60579840 unmapped: 180224 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:41.049230+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60579840 unmapped: 180224 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:42.049442+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60579840 unmapped: 180224 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:43.049603+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60579840 unmapped: 180224 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:44.049737+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60588032 unmapped: 172032 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:45.049905+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60588032 unmapped: 172032 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:46.050175+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60588032 unmapped: 172032 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:47.050391+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60596224 unmapped: 163840 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:48.050572+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60596224 unmapped: 163840 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:49.050698+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60604416 unmapped: 155648 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:50.050900+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60604416 unmapped: 155648 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:51.051045+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60612608 unmapped: 147456 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:52.051220+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60612608 unmapped: 147456 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:53.051375+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60612608 unmapped: 147456 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:54.051595+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60620800 unmapped: 139264 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:55.051792+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60620800 unmapped: 139264 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:56.052160+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60620800 unmapped: 139264 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:57.052380+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60628992 unmapped: 131072 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:58.052562+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60628992 unmapped: 131072 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:59.052747+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60637184 unmapped: 122880 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:00.053199+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60637184 unmapped: 122880 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:01.053365+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60637184 unmapped: 122880 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:02.053544+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60645376 unmapped: 114688 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:03.053731+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60645376 unmapped: 114688 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:04.053890+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60653568 unmapped: 106496 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:05.054081+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60653568 unmapped: 106496 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:06.054224+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60653568 unmapped: 106496 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:07.054364+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60661760 unmapped: 98304 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:08.054530+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60661760 unmapped: 98304 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:09.054683+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60661760 unmapped: 98304 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:10.054919+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:11.055047+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:12.055193+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:13.055397+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:14.055538+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:15.055693+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:16.055833+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:17.056054+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:18.056184+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60694528 unmapped: 65536 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:19.056469+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60694528 unmapped: 65536 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:20.056664+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60702720 unmapped: 57344 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:21.056826+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60702720 unmapped: 57344 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:22.056960+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60702720 unmapped: 57344 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:23.057096+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60710912 unmapped: 49152 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:24.057241+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60710912 unmapped: 49152 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:25.057376+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60710912 unmapped: 49152 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:26.057558+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:27.057717+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:28.057957+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:29.058151+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:30.058344+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:31.058516+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:32.058686+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:33.058849+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:34.059014+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:35.059247+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:36.059465+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60751872 unmapped: 8192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:37.059681+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60751872 unmapped: 8192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:38.059875+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60751872 unmapped: 8192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:39.059995+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:40.060142+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:41.060276+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60768256 unmapped: 1040384 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:42.060427+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60768256 unmapped: 1040384 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:43.060555+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60768256 unmapped: 1040384 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:44.060726+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60776448 unmapped: 1032192 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:45.060903+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60776448 unmapped: 1032192 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:46.061055+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60776448 unmapped: 1032192 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:47.061188+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:48.061347+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:49.061527+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:50.061812+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:51.061980+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:52.062113+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:53.062252+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:54.062382+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:55.062612+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:56.062756+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:57.062892+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:58.063035+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:59.063197+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:00.063419+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:01.063597+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:02.063721+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:03.063886+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:04.064079+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:05.064214+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:06.064364+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:07.064489+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:08.064603+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:09.064727+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:10.065076+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:11.065243+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:12.065357+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:13.065481+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:14.065592+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:15.065721+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:16.065872+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:17.066026+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:18.066176+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:19.066323+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:20.066463+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:21.066590+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:22.066708+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:23.066838+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:24.066966+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:25.067104+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:26.067242+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:27.067437+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:28.067600+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:29.067765+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:30.068025+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:31.068204+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:32.068459+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:33.068624+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:34.068747+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:35.068925+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:36.069076+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:37.069501+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:38.069851+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:39.070004+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:40.070240+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:41.070434+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:42.070622+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:43.070766+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:44.070954+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:45.071134+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:46.071302+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:47.071445+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:48.071575+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:49.071739+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:50.072090+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:51.072316+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:52.072474+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:53.072619+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:54.072740+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:55.072857+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:56.072968+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:57.073128+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:58.073343+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:59.073473+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:00.073677+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:01.073809+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:02.074002+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:03.074259+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:04.074443+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:05.074615+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:06.074924+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:07.075066+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:08.075202+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:09.075352+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:10.075647+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:11.075792+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:12.075940+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:13.076117+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:14.076366+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:15.076540+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:16.076729+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:17.076915+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:18.077084+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:19.077317+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:20.077479+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:21.077644+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:22.077863+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:23.078059+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:24.078183+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:25.078375+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:26.078549+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:27.078676+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:28.078849+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:29.078999+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:30.079211+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:31.079357+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:32.079502+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:33.080000+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:34.080153+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:35.080368+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:36.080504+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:37.080692+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60784640 unmapped: 1024000 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:38.080895+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:39.081076+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:40.081261+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:41.081430+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:42.081571+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:43.081844+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:44.082009+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:45.082255+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:46.082399+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:47.082571+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:48.082745+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:49.082975+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:50.083222+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:51.083439+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:52.083585+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:53.083776+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:54.084151+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:55.084500+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:56.084641+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:57.084765+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:58.084958+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:59.085114+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:00.085320+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:01.085471+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:02.085617+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:03.085797+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:04.085971+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:05.086141+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:06.086321+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:07.086498+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:08.086700+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:09.086897+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:10.087065+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:11.087232+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:12.087357+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:13.087560+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:14.087750+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:15.087902+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:16.088202+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:17.088418+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:18.088584+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:19.088725+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:20.088918+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:21.089056+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:22.089206+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:23.089402+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:24.089547+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:25.089675+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:26.089831+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:27.089984+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:28.090115+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:29.090275+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:30.090530+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:31.090674+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:32.090841+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:33.090999+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1015808 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:34.091198+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:35.091491+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:36.091787+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:37.092044+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:38.092271+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:39.092465+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:40.092679+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:41.092815+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:42.092954+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:43.093138+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:44.093339+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:45.093465+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:46.093613+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:47.093753+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:48.093881+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:49.094012+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:50.094721+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:51.095252+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:52.095515+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:53.095652+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:54.097469+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:55.097837+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:56.098087+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:57.098261+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:58.098542+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 ms_handle_reset con 0x555f1b271c00 session 0x555f19f23860
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b3fe400
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 ms_handle_reset con 0x555f1b3fe800 session 0x555f1b28fc20
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b271c00
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:59.098758+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:00.098938+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:01.099110+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:02.099285+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:03.099464+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:04.099622+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:05.099793+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:06.100014+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:07.100226+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:08.100422+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:09.100613+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:10.100891+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:11.101085+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:12.101233+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:13.101397+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:14.101579+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:15.101713+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:16.101850+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:17.102011+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:18.102135+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:19.102323+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:20.102556+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:21.102719+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:22.102894+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:23.103204+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:24.103347+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:25.104077+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:26.104222+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:27.104362+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:28.104487+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:29.104639+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:30.104789+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:31.104928+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:32.105062+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:33.105198+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:34.105341+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:35.105463+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:36.105639+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:37.105815+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:38.106058+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:39.106359+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:40.106526+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:41.106660+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:42.106846+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:43.106979+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:44.107213+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:45.107348+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:46.107551+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:47.107764+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:48.107953+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:49.108090+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:50.108249+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:51.108409+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:52.108556+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:53.108703+0000)
Dec 01 09:48:05 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0) v1
Dec 01 09:48:05 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/840775449' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:54.109024+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:55.109217+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:56.109373+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:57.109532+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:58.109730+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:59.109871+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:00.110410+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:01.110565+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:02.110761+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:03.110910+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:04.111236+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:05.111374+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:06.111708+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:07.111915+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:08.112208+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:09.112424+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:10.114394+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:11.114596+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:12.114817+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:13.115006+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:14.115144+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:15.115310+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:16.115509+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:17.115674+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:18.115892+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:19.116084+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:20.116354+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:21.116604+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:22.116748+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:23.116853+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:24.117001+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:25.117151+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:26.117507+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:27.117856+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:28.118064+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:29.118232+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:30.118461+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:31.118595+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:32.118781+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:33.118917+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:34.119092+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:35.119369+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:36.119528+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:37.119648+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:38.119802+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:39.119924+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:40.120227+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:41.120370+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:42.120560+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:43.120712+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:44.121082+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:45.121228+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:46.121406+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:47.121550+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:48.121671+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:49.121834+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:50.122068+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:51.122268+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:52.122472+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:53.122600+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:54.122768+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:55.122935+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:56.123312+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:57.123441+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:58.123578+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:59.123715+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:00.123919+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:01.124322+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:02.124486+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:03.124736+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:04.124928+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:05.125078+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:06.125602+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:07.126045+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:08.126191+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:09.126627+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:10.127006+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:11.127668+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:12.128035+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:13.128420+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:14.128570+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:15.128720+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:16.128860+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:17.129108+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:18.129273+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:19.129451+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:20.129627+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:21.129809+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:22.129939+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:23.130100+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:24.130264+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:25.130404+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:26.130538+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:27.130705+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:28.130844+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:29.131051+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:30.131263+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:31.131774+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:32.131892+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:33.132060+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:34.133674+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:35.133798+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:36.134355+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:37.134495+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:38.134636+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:39.134775+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:40.134931+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:41.135427+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:42.135577+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:43.135756+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:44.135874+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:45.135986+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:46.136130+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:47.136311+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:48.136474+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:49.136630+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:50.136853+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:51.136980+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:52.137148+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:53.137303+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:54.137444+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:55.137657+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:56.137836+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:57.138082+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:58.138265+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:59.138399+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:00.138591+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:01.138707+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:02.138857+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:03.139363+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:04.139534+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:05.139704+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:06.139894+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:07.140316+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:08.140562+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:09.140878+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:10.141098+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:11.141280+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:12.141435+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:13.141668+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:14.141897+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:15.142034+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:16.142268+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60801024 unmapped: 1007616 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:17.142429+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:18.142667+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:19.142924+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:20.143154+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:21.143359+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:22.143536+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:23.143765+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:24.143936+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:25.144283+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:26.144566+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:27.144969+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:28.145239+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:29.145536+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:30.145873+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:31.146070+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:32.146238+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:33.146430+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:34.146602+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:35.146754+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:36.146889+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:37.147059+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:38.147239+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:39.147391+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:40.147605+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:41.147742+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:42.147895+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:43.148059+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:44.148263+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:45.148486+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:46.148635+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:47.148803+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:48.149010+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:49.149221+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:50.149474+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:51.149634+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:52.149776+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:53.149940+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:54.150139+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:55.150362+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:56.150553+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:57.150665+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:58.150783+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:59.150914+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:00.151092+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:01.151249+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:02.151548+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:03.151658+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:04.151926+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:05.152153+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:06.152350+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:07.152527+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:08.152698+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:09.152834+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:10.153089+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:11.153346+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:12.153543+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:13.153740+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:14.153873+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:15.154038+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:16.154251+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:17.154419+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:18.154562+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:19.154725+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:20.154911+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:21.155049+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:22.155190+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:23.155331+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:24.155534+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:25.155696+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:26.155876+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:27.156073+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:28.156235+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:29.156380+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:30.156562+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:31.156845+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:32.157077+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:33.157249+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:34.157488+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:35.157652+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:36.157805+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:37.157935+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:38.158050+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:39.158235+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:40.158571+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:41.158711+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:42.158900+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:43.159032+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:44.159199+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:45.159399+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Cumulative writes: 4343 writes, 19K keys, 4343 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 4343 writes, 398 syncs, 10.91 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.05              0.00         1    0.054       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 0.0001 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 0.0001 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 0.0001 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 0.0001 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 0.0001 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 0.0001 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 0.0001 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f19149090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f19149090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f19149090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 0.0001 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555f191491f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 0.0001 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:46.159574+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:47.159733+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:48.159894+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:49.160036+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:50.160236+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:51.160376+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:52.160531+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:53.160762+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:54.160901+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:55.161023+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:56.161178+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:57.161423+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:58.161619+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:59.161834+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:00.163065+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:01.163208+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:02.163373+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:03.163530+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:04.163693+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:05.163846+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:06.163977+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:07.164145+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:08.164425+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:09.164641+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:10.164871+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:11.165047+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:12.165211+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:13.165429+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:14.165618+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:15.165800+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:16.165984+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:17.166157+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:18.166375+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:19.166530+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:20.167011+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:21.167221+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:22.167360+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:23.167494+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:24.167629+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:25.167777+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:26.167912+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:27.168065+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:28.168278+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:29.168506+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:30.168855+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:31.169012+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:32.169218+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:33.169399+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:34.169535+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:35.169669+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:36.169795+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:37.169923+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:38.170028+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:39.170151+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:40.170282+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:41.170452+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:42.170648+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:43.170817+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:44.170995+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:45.171170+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:46.171392+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:47.171614+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:48.171897+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:49.172044+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:50.172246+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:51.172432+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:52.172645+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:53.172874+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:54.173023+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:55.173224+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:56.173386+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:57.173547+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:58.173741+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:59.173867+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:00.174049+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:01.174243+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:02.174385+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:03.174524+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:04.174667+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:05.174866+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:06.175102+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:07.175268+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:08.175474+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:09.175733+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:10.175926+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:11.176089+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:12.176251+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:13.176426+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:14.176586+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:15.176753+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:16.176938+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:17.177232+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:18.177443+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:19.177618+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:20.177817+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:21.177962+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:22.178157+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:23.178378+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:24.178592+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:25.178775+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:26.178963+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:27.179168+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:28.179382+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:29.179571+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:30.179744+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:31.179981+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:32.180191+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:33.180374+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:34.180542+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:35.180755+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:36.180903+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:37.181221+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:38.181591+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:39.181737+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:40.181967+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:41.182141+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:42.182275+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60809216 unmapped: 999424 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:43.182443+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60817408 unmapped: 991232 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:44.182637+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60817408 unmapped: 991232 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:45.182815+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60817408 unmapped: 991232 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:46.182963+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 431940 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60817408 unmapped: 991232 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:47.183161+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60817408 unmapped: 991232 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:48.183350+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60817408 unmapped: 991232 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:49.183583+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60817408 unmapped: 991232 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858c00
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 handle_osd_map epochs [47,48], i have 47, src has [1,48]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 1081.428955078s of 1081.454589844s, submitted: 8
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 47 handle_osd_map epochs [48,48], i have 48, src has [1,48]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 48 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:50.183842+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 48 heartbeat osd_stat(store_statfs(0x4fe0e7000/0x0/0x4ffc00000, data 0x9dc64/0xe7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 60850176 unmapped: 958464 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 48 handle_osd_map epochs [48,49], i have 48, src has [1,49]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:51.184036+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 552245 data_alloc: 218103808 data_used: 24576
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61202432 unmapped: 17391616 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 49 heartbeat osd_stat(store_statfs(0x4fe0e2000/0x0/0x4ffc00000, data 0x9f226/0xeb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 49 handle_osd_map epochs [49,50], i have 49, src has [1,50]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 50 handle_osd_map epochs [50,50], i have 50, src has [1,50]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 50 ms_handle_reset con 0x555f1b858c00 session 0x555f1abae780
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:52.184225+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858000
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61243392 unmapped: 17350656 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:53.184413+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61251584 unmapped: 17342464 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 50 handle_osd_map epochs [51,51], i have 50, src has [1,51]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 51 ms_handle_reset con 0x555f1b858000 session 0x555f1ab3f680
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:54.184585+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61251584 unmapped: 17342464 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:55.184982+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61251584 unmapped: 17342464 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 51 heartbeat osd_stat(store_statfs(0x4fc8d7000/0x0/0x4ffc00000, data 0x18a33eb/0x18f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:56.185186+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 614157 data_alloc: 218103808 data_used: 24576
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61251584 unmapped: 17342464 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:57.185359+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61251584 unmapped: 17342464 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:58.185488+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61251584 unmapped: 17342464 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:59.185662+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61251584 unmapped: 17342464 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:00.185910+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 51 heartbeat osd_stat(store_statfs(0x4fc8d7000/0x0/0x4ffc00000, data 0x18a33eb/0x18f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 51 handle_osd_map epochs [52,52], i have 51, src has [1,52]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.322311401s of 10.521731377s, submitted: 31
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:01.186075+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 615609 data_alloc: 218103808 data_used: 24576
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:02.186366+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:03.186564+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 52 heartbeat osd_stat(store_statfs(0x4fc8d6000/0x0/0x4ffc00000, data 0x18a488b/0x18f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:04.186736+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 52 heartbeat osd_stat(store_statfs(0x4fc8d6000/0x0/0x4ffc00000, data 0x18a488b/0x18f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:05.186933+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:06.187096+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 615609 data_alloc: 218103808 data_used: 24576
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:07.187274+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:08.187519+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 52 heartbeat osd_stat(store_statfs(0x4fc8d6000/0x0/0x4ffc00000, data 0x18a488b/0x18f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:09.187763+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:10.188003+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:11.188173+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 615609 data_alloc: 218103808 data_used: 24576
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:12.188351+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:13.188547+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 52 heartbeat osd_stat(store_statfs(0x4fc8d6000/0x0/0x4ffc00000, data 0x18a488b/0x18f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:14.188752+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 52 heartbeat osd_stat(store_statfs(0x4fc8d6000/0x0/0x4ffc00000, data 0x18a488b/0x18f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:15.188970+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:16.189150+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 615609 data_alloc: 218103808 data_used: 24576
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 52 heartbeat osd_stat(store_statfs(0x4fc8d6000/0x0/0x4ffc00000, data 0x18a488b/0x18f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:17.189430+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:18.189623+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:19.189788+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:20.190451+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 52 heartbeat osd_stat(store_statfs(0x4fc8d6000/0x0/0x4ffc00000, data 0x18a488b/0x18f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:21.191033+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 615609 data_alloc: 218103808 data_used: 24576
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:22.192426+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 52 heartbeat osd_stat(store_statfs(0x4fc8d6000/0x0/0x4ffc00000, data 0x18a488b/0x18f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:23.193262+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:24.193686+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:25.193847+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61186048 unmapped: 17408000 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b8eb400
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 25.412767410s of 25.425762177s, submitted: 13
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:26.194013+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 52 handle_osd_map epochs [53,53], i have 52, src has [1,53]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 620555 data_alloc: 218103808 data_used: 28672
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 53 ms_handle_reset con 0x555f1b8eb400 session 0x555f1b28e5a0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61243392 unmapped: 17350656 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 53 heartbeat osd_stat(store_statfs(0x4fc8d2000/0x0/0x4ffc00000, data 0x18a5e55/0x18fb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:27.194362+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b8eb000
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 53 ms_handle_reset con 0x555f1b8eb000 session 0x555f1aa64960
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb09000
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 53 ms_handle_reset con 0x555f1cb09000 session 0x555f1aa645a0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61259776 unmapped: 17334272 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:28.194789+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 53 heartbeat osd_stat(store_statfs(0x4fc8d2000/0x0/0x4ffc00000, data 0x18a5e55/0x18fb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61259776 unmapped: 17334272 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb09000
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:29.195081+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 53 handle_osd_map epochs [54,54], i have 53, src has [1,54]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 54 ms_handle_reset con 0x555f1cb09000 session 0x555f1aa33a40
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 62332928 unmapped: 16261120 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858000
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 54 ms_handle_reset con 0x555f1b858000 session 0x555f1aa32d20
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08800
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 54 ms_handle_reset con 0x555f1cb08800 session 0x555f1aa32000
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:30.195390+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08c00
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 62332928 unmapped: 16261120 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:31.196266+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 624199 data_alloc: 218103808 data_used: 28672
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 54 handle_osd_map epochs [55,55], i have 54, src has [1,55]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 55 ms_handle_reset con 0x555f1cb08c00 session 0x555f1aa32780
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61358080 unmapped: 17235968 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:32.196508+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858c00
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 61390848 unmapped: 17203200 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:33.196700+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 55 heartbeat osd_stat(store_statfs(0x4fc8cc000/0x0/0x4ffc00000, data 0x18a8a1d/0x1901000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 55 handle_osd_map epochs [55,56], i have 55, src has [1,56]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 56 ms_handle_reset con 0x555f1b858c00 session 0x555f1aa32f00
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 62455808 unmapped: 16138240 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:34.196971+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 62455808 unmapped: 16138240 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb09000
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:35.197209+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 56 handle_osd_map epochs [57,57], i have 56, src has [1,57]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 57 ms_handle_reset con 0x555f1cb09000 session 0x555f1aa64780
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 62504960 unmapped: 16089088 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:36.197456+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 634040 data_alloc: 218103808 data_used: 28672
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 62521344 unmapped: 16072704 heap: 78594048 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08c00
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.331473351s of 11.440871239s, submitted: 32
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:37.197850+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08800
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858000
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 57 handle_osd_map epochs [58,58], i have 57, src has [1,58]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 58 ms_handle_reset con 0x555f1cb08800 session 0x555f1ab53a40
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 58 ms_handle_reset con 0x555f1cb08c00 session 0x555f1aa652c0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 15613952 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858c00
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:38.198035+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 58 handle_osd_map epochs [59,59], i have 58, src has [1,59]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 59 ms_handle_reset con 0x555f1b858c00 session 0x555f1ab3e3c0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 63291392 unmapped: 23699456 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b8eb000
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:39.198255+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 59 heartbeat osd_stat(store_statfs(0x4fa8b9000/0x0/0x4ffc00000, data 0x38adc5c/0x3914000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 59 handle_osd_map epochs [60,60], i have 59, src has [1,60]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 60 ms_handle_reset con 0x555f1b8eb000 session 0x555f1a5cde00
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 60 ms_handle_reset con 0x555f1b858000 session 0x555f1c7ac780
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 63479808 unmapped: 23511040 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858c00
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08800
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:40.198693+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 60 handle_osd_map epochs [61,61], i have 60, src has [1,61]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 61 ms_handle_reset con 0x555f1b858c00 session 0x555f1ab6e000
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08c00
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 64602112 unmapped: 22388736 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:41.198864+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 61 handle_osd_map epochs [61,62], i have 61, src has [1,62]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 62 ms_handle_reset con 0x555f1cb08c00 session 0x555f1aa610e0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1009042 data_alloc: 218103808 data_used: 45056
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 62 ms_handle_reset con 0x555f1cb08800 session 0x555f1aa5f860
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb09000
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 65748992 unmapped: 21241856 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b8eb400
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:42.199046+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 62 handle_osd_map epochs [62,63], i have 62, src has [1,63]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb23000
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 63 handle_osd_map epochs [63,63], i have 63, src has [1,63]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 63 ms_handle_reset con 0x555f1cb23000 session 0x555f1ab532c0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858000
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858c00
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 63 ms_handle_reset con 0x555f1cb09000 session 0x555f1b29cd20
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 63 ms_handle_reset con 0x555f1b8eb400 session 0x555f1c7ac5a0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 66019328 unmapped: 20971520 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:43.200606+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 63 handle_osd_map epochs [64,64], i have 63, src has [1,64]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 66109440 unmapped: 20881408 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:44.200758+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 64 handle_osd_map epochs [65,65], i have 64, src has [1,65]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 65 ms_handle_reset con 0x555f1b858c00 session 0x555f1aa5f2c0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 65 ms_handle_reset con 0x555f1b858000 session 0x555f1ab53680
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08800
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 65 ms_handle_reset con 0x555f1cb08800 session 0x555f1ab3ef00
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 66191360 unmapped: 20799488 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08800
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 65 heartbeat osd_stat(store_statfs(0x4fc89a000/0x0/0x4ffc00000, data 0x18b8e3b/0x1930000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:45.200905+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 65 handle_osd_map epochs [66,66], i have 65, src has [1,66]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 66 ms_handle_reset con 0x555f1cb08800 session 0x555f1ab3e780
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 67280896 unmapped: 19709952 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858000
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:46.201073+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 66 handle_osd_map epochs [67,67], i have 66, src has [1,67]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 705756 data_alloc: 218103808 data_used: 65536
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 67 ms_handle_reset con 0x555f1b858000 session 0x555f1a5cd0e0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 67362816 unmapped: 19628032 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858c00
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:47.201248+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 67 handle_osd_map epochs [68,68], i have 67, src has [1,68]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.978338242s of 10.211093903s, submitted: 268
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b8eb400
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 67551232 unmapped: 19439616 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 68 ms_handle_reset con 0x555f1b858c00 session 0x555f1aa610e0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb09000
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:48.201445+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 68 handle_osd_map epochs [68,69], i have 68, src has [1,69]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 69 ms_handle_reset con 0x555f1b8eb400 session 0x555f1c783c20
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 69 ms_handle_reset con 0x555f1cb09000 session 0x555f1aa32b40
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 67641344 unmapped: 19349504 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb09000
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858000
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:49.201611+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 69 handle_osd_map epochs [69,70], i have 69, src has [1,70]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 70 ms_handle_reset con 0x555f1cb09000 session 0x555f1ab6fc20
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 70 ms_handle_reset con 0x555f1b858000 session 0x555f1c783a40
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 67756032 unmapped: 19234816 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858c00
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:50.202110+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 70 handle_osd_map epochs [71,71], i have 70, src has [1,71]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 71 ms_handle_reset con 0x555f1b858c00 session 0x555f1aa64f00
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 67805184 unmapped: 19185664 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 71 heartbeat osd_stat(store_statfs(0x4fb2e2000/0x0/0x4ffc00000, data 0x18bf0bd/0x1936000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:51.202264+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 712002 data_alloc: 218103808 data_used: 73728
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 67846144 unmapped: 19144704 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:52.202393+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b8eb400
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 71 ms_handle_reset con 0x555f1b8eb400 session 0x555f1ab3f860
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 67846144 unmapped: 19144704 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:53.202549+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08800
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 71 ms_handle_reset con 0x555f1cb08800 session 0x555f1ab3e780
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 67829760 unmapped: 19161088 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08800
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 71 ms_handle_reset con 0x555f1cb08800 session 0x555f1ab3f4a0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858000
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:54.202714+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 67829760 unmapped: 19161088 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:55.202870+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 71 handle_osd_map epochs [73,73], i have 71, src has [1,73]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 71 handle_osd_map epochs [72,73], i have 71, src has [1,73]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 73 ms_handle_reset con 0x555f1b858000 session 0x555f1ab3fc20
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 73 heartbeat osd_stat(store_statfs(0x4fb2bd000/0x0/0x4ffc00000, data 0x18e5d82/0x1960000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69132288 unmapped: 17858560 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858c00
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b8eb400
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:56.203040+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 722395 data_alloc: 218103808 data_used: 81920
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69156864 unmapped: 17833984 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:57.203187+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 73 heartbeat osd_stat(store_statfs(0x4fb2bd000/0x0/0x4ffc00000, data 0x18e5d82/0x1960000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69156864 unmapped: 17833984 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:58.203362+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69156864 unmapped: 17833984 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb09000
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:59.203470+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 73 handle_osd_map epochs [74,74], i have 73, src has [1,74]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.536053658s of 12.151283264s, submitted: 168
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 74 ms_handle_reset con 0x555f1cb09000 session 0x555f1aa614a0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69206016 unmapped: 17784832 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08c00
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:00.203660+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 74 handle_osd_map epochs [75,75], i have 74, src has [1,75]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 75 ms_handle_reset con 0x555f1cb08c00 session 0x555f1aa603c0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69255168 unmapped: 17735680 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:01.203888+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb22c00
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 731868 data_alloc: 218103808 data_used: 102400
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 75 ms_handle_reset con 0x555f1cb22c00 session 0x555f1aa330e0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69443584 unmapped: 17547264 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:02.204415+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858000
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 75 ms_handle_reset con 0x555f1b858000 session 0x555f1ab53e00
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08800
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 75 ms_handle_reset con 0x555f1cb08800 session 0x555f1b28e960
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69443584 unmapped: 17547264 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:03.204779+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 75 heartbeat osd_stat(store_statfs(0x4fb2b7000/0x0/0x4ffc00000, data 0x18e8982/0x1966000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 75 handle_osd_map epochs [76,76], i have 75, src has [1,76]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08c00
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 76 handle_osd_map epochs [76,77], i have 76, src has [1,77]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 77 handle_osd_map epochs [77,77], i have 77, src has [1,77]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69484544 unmapped: 17506304 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 77 ms_handle_reset con 0x555f1cb08c00 session 0x555f1c782960
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:04.205001+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb09000
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 77 handle_osd_map epochs [78,78], i have 77, src has [1,78]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 78 ms_handle_reset con 0x555f1cb09000 session 0x555f1ab661e0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69492736 unmapped: 17498112 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:05.205155+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 78 handle_osd_map epochs [78,79], i have 78, src has [1,79]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 79 heartbeat osd_stat(store_statfs(0x4fb2af000/0x0/0x4ffc00000, data 0x18eb9f2/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69541888 unmapped: 17448960 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:06.205373+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 748246 data_alloc: 218103808 data_used: 102400
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69541888 unmapped: 17448960 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:07.205605+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69541888 unmapped: 17448960 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb22800
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 79 ms_handle_reset con 0x555f1cb22800 session 0x555f1ab67860
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858000
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 79 ms_handle_reset con 0x555f1b858000 session 0x555f1ab66d20
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08800
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 79 ms_handle_reset con 0x555f1cb08800 session 0x555f1ab66000
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:08.205780+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08c00
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 79 ms_handle_reset con 0x555f1cb08c00 session 0x555f1ab66b40
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb09000
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb22400
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 79 ms_handle_reset con 0x555f1cb22400 session 0x555f1aa32b40
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 79 ms_handle_reset con 0x555f1cb09000 session 0x555f1ab67c20
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858000
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 79 ms_handle_reset con 0x555f1b858000 session 0x555f1c782000
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08800
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 79 ms_handle_reset con 0x555f1cb08800 session 0x555f1ab66b40
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb09c00
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 79 ms_handle_reset con 0x555f1cb09c00 session 0x555f1ab67860
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb09800
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69591040 unmapped: 17399808 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 79 ms_handle_reset con 0x555f1cb09800 session 0x555f1aa61c20
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:09.205980+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 79 heartbeat osd_stat(store_statfs(0x4fb2aa000/0x0/0x4ffc00000, data 0x18ee51a/0x1974000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69591040 unmapped: 17399808 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:10.206530+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb09400
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 79 handle_osd_map epochs [80,80], i have 79, src has [1,80]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.622175217s of 10.975051880s, submitted: 122
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 80 ms_handle_reset con 0x555f1cb09400 session 0x555f1ab6e3c0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69910528 unmapped: 17080320 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858000
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:11.206702+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08800
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 757501 data_alloc: 218103808 data_used: 114688
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69918720 unmapped: 17072128 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 80 heartbeat osd_stat(store_statfs(0x4fb281000/0x0/0x4ffc00000, data 0x1913a15/0x199c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:12.206891+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69918720 unmapped: 17072128 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 80 heartbeat osd_stat(store_statfs(0x4fb281000/0x0/0x4ffc00000, data 0x1913a15/0x199c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:13.207062+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69918720 unmapped: 17072128 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:14.207268+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69918720 unmapped: 17072128 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:15.207435+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb09800
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 80 ms_handle_reset con 0x555f1cb09800 session 0x555f1abae3c0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb09c00
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69836800 unmapped: 17154048 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:16.207585+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 80 handle_osd_map epochs [81,81], i have 80, src has [1,81]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 81 ms_handle_reset con 0x555f1cb09c00 session 0x555f1ab3e960
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08000
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08c00
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 81 ms_handle_reset con 0x555f1cb08000 session 0x555f1aa652c0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 81 ms_handle_reset con 0x555f1cb08c00 session 0x555f1b4b92c0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb23400
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 81 ms_handle_reset con 0x555f1cb23400 session 0x555f1aa60780
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 766180 data_alloc: 218103808 data_used: 135168
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08000
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 69959680 unmapped: 17031168 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:17.207719+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 81 handle_osd_map epochs [82,82], i have 81, src has [1,82]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 82 ms_handle_reset con 0x555f1cb08000 session 0x555f1ab6fc20
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08c00
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 70017024 unmapped: 16973824 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:18.207885+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 82 handle_osd_map epochs [83,83], i have 82, src has [1,83]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 83 ms_handle_reset con 0x555f1cb08c00 session 0x555f1ab66d20
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 71081984 unmapped: 15908864 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 83 heartbeat osd_stat(store_statfs(0x4fb276000/0x0/0x4ffc00000, data 0x1917fc3/0x19a7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:19.208046+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb09800
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 83 ms_handle_reset con 0x555f1cb09800 session 0x555f1c782000
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb09c00
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 83 ms_handle_reset con 0x555f1cb09c00 session 0x555f1aa60780
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb23c00
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 83 ms_handle_reset con 0x555f1cb23c00 session 0x555f1aa652c0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 15892480 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:20.208236+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08000
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 15892480 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 83 ms_handle_reset con 0x555f1cb08000 session 0x555f1ab67860
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:21.208441+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08c00
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.769915581s of 10.991518974s, submitted: 90
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 771276 data_alloc: 218103808 data_used: 139264
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 71106560 unmapped: 15884288 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:22.208653+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 83 handle_osd_map epochs [84,84], i have 83, src has [1,84]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 84 ms_handle_reset con 0x555f1cb08c00 session 0x555f1b91a780
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 15859712 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:23.208849+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 84 ms_handle_reset con 0x555f1b858000 session 0x555f1b28e960
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 84 ms_handle_reset con 0x555f1cb08800 session 0x555f1ab6f860
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 84 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x19191dd/0x19a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb09800
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 71163904 unmapped: 15826944 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:24.209058+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72237056 unmapped: 14753792 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:25.209213+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 84 handle_osd_map epochs [85,85], i have 84, src has [1,85]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72278016 unmapped: 14712832 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:26.209349+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 85 handle_osd_map epochs [86,86], i have 85, src has [1,86]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 86 ms_handle_reset con 0x555f1cb09800 session 0x555f1b29cd20
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 776985 data_alloc: 218103808 data_used: 139264
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72302592 unmapped: 14688256 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:27.209535+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72318976 unmapped: 14671872 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:28.209777+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72318976 unmapped: 14671872 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:29.211060+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 86 ms_handle_reset con 0x555f1b858c00 session 0x555f1ab53a40
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 86 ms_handle_reset con 0x555f1b8eb400 session 0x555f1abaf860
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858000
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 86 heartbeat osd_stat(store_statfs(0x4fb293000/0x0/0x4ffc00000, data 0x18f7c42/0x1988000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 86 ms_handle_reset con 0x555f1b858000 session 0x555f1b29d680
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72351744 unmapped: 14639104 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:30.211226+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72351744 unmapped: 14639104 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:31.211415+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08000
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 86 handle_osd_map epochs [87,87], i have 86, src has [1,87]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.348530769s of 10.076562881s, submitted: 127
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 87 ms_handle_reset con 0x555f1cb08000 session 0x555f1b4b9860
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1cb08800
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 775824 data_alloc: 218103808 data_used: 143360
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72368128 unmapped: 14622720 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:32.211534+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _renew_subs
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 87 handle_osd_map epochs [88,88], i have 87, src has [1,88]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 88 ms_handle_reset con 0x555f1cb08800 session 0x555f1c782960
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72376320 unmapped: 14614528 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:33.211648+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72376320 unmapped: 14614528 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:34.211785+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 88 heartbeat osd_stat(store_statfs(0x4fb2b5000/0x0/0x4ffc00000, data 0x18d66d9/0x1967000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72376320 unmapped: 14614528 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:35.211934+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72376320 unmapped: 14614528 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:36.212092+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 777730 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72376320 unmapped: 14614528 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:37.212824+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72376320 unmapped: 14614528 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:38.212990+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 88 heartbeat osd_stat(store_statfs(0x4fb2b5000/0x0/0x4ffc00000, data 0x18d66d9/0x1967000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 88 handle_osd_map epochs [89,89], i have 88, src has [1,89]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 88 handle_osd_map epochs [89,89], i have 89, src has [1,89]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72376320 unmapped: 14614528 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:39.213163+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72376320 unmapped: 14614528 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 89 heartbeat osd_stat(store_statfs(0x4fb2b3000/0x0/0x4ffc00000, data 0x18d7b95/0x196a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:40.213347+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72376320 unmapped: 14614528 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:41.213495+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 779854 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72376320 unmapped: 14614528 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:42.213644+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 89 handle_osd_map epochs [90,90], i have 89, src has [1,90]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.189030647s of 11.284521103s, submitted: 85
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b3000/0x0/0x4ffc00000, data 0x18d7b95/0x196a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:43.213775+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:44.213932+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:45.214089+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:46.214348+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:47.214593+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:48.214738+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:49.214911+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:50.215160+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:51.215432+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:52.215585+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:53.215723+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:54.215866+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:55.216154+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:56.216280+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:57.216403+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:58.216532+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:59.216670+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:00.216864+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:01.216973+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:02.217108+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:03.217276+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:04.217440+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:05.217585+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:06.217745+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:07.217902+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:08.218584+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:09.218741+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:10.218992+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:11.219183+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:12.219368+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:13.219546+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:14.219740+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:15.219921+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:16.220429+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 14581760 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:17.220707+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:18.221010+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:19.221173+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:20.221468+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:21.221654+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:22.221835+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:23.222077+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:24.222339+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:25.222542+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:26.222808+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:27.222975+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:28.223137+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:29.223327+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:30.224003+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:31.224189+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:32.228920+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:33.233185+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:34.236264+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:35.236970+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:36.238224+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:37.238864+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:38.241145+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:39.243172+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:40.243800+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:41.245062+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:42.245386+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:43.245682+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:44.245980+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:45.246186+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:46.246419+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:47.246601+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:48.246770+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:49.246932+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:50.247260+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:51.247441+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:52.247612+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:53.247825+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:54.247997+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:55.248172+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:56.248324+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:57.248476+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:58.248642+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 14573568 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:59.248790+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72613888 unmapped: 14376960 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: do_command 'config diff' '{prefix=config diff}'
Dec 01 09:48:05 compute-0 ceph-osd[89052]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec 01 09:48:05 compute-0 ceph-osd[89052]: do_command 'config show' '{prefix=config show}'
Dec 01 09:48:05 compute-0 ceph-osd[89052]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec 01 09:48:05 compute-0 ceph-osd[89052]: do_command 'counter dump' '{prefix=counter dump}'
Dec 01 09:48:05 compute-0 ceph-osd[89052]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:00.249023+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: do_command 'counter schema' '{prefix=counter schema}'
Dec 01 09:48:05 compute-0 ceph-osd[89052]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 14114816 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:01.249180+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72843264 unmapped: 14147584 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:02.249324+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 13893632 heap: 86990848 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: do_command 'log dump' '{prefix=log dump}'
Dec 01 09:48:05 compute-0 ceph-osd[89052]: do_command 'log dump' '{prefix=log dump}' result is 0 bytes
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:03.249452+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: do_command 'perf dump' '{prefix=perf dump}'
Dec 01 09:48:05 compute-0 ceph-osd[89052]: do_command 'perf dump' '{prefix=perf dump}' result is 0 bytes
Dec 01 09:48:05 compute-0 ceph-osd[89052]: do_command 'perf histogram dump' '{prefix=perf histogram dump}'
Dec 01 09:48:05 compute-0 ceph-osd[89052]: do_command 'perf histogram dump' '{prefix=perf histogram dump}' result is 0 bytes
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73039872 unmapped: 24993792 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: do_command 'perf schema' '{prefix=perf schema}'
Dec 01 09:48:05 compute-0 ceph-osd[89052]: do_command 'perf schema' '{prefix=perf schema}' result is 0 bytes
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:04.249559+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:05.249673+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:06.249800+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:07.249997+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:08.250157+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:09.250307+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:10.250467+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:11.250586+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:12.250724+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:13.250873+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:14.250981+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:15.251135+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:16.251327+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:17.251498+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:18.251632+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:19.251771+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:20.251925+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:21.252056+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:22.252215+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:23.252339+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:24.252495+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:25.252621+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:26.252778+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:27.252909+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:28.253057+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:29.253183+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:30.253361+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:31.253520+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:32.253660+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:33.253826+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:34.253971+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:35.254145+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:36.254363+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:37.254657+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:38.254844+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:39.255027+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:40.255363+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:41.255538+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:42.255686+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:43.255888+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:44.256088+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:45.256257+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:46.256422+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:47.256558+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:48.256674+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:49.256820+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:50.256993+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:51.257125+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:52.257305+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:53.257463+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:54.257636+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:55.257808+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:56.258007+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:57.258221+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:58.258356+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:59.259442+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:00.259625+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:01.259795+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:02.259999+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:03.260147+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:04.260343+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:05.260634+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:06.260857+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:07.261052+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:08.261264+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:09.261417+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:10.261663+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:11.261819+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:12.261938+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:13.262078+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:14.262236+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:15.262342+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:16.262552+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:17.262702+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:18.262889+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:19.263101+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:20.263345+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:21.263593+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:22.263757+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:23.263901+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:24.264085+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:25.264260+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:26.264575+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:27.264798+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:28.265014+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:29.265206+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:30.265434+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:31.265637+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:32.265723+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:33.265879+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:34.266091+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:35.266237+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:36.266369+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:37.266503+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:38.266633+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:39.266826+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:40.267042+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:41.267129+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:42.267263+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:43.267437+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:44.267600+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:45.267784+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:46.267946+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:47.268156+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:48.268339+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:49.268542+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:50.268745+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:51.268906+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72876032 unmapped: 25157632 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:52.269021+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:53.269148+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:54.269399+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:55.269566+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:56.269749+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:57.269886+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:58.742378+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:59.742586+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:00.742763+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:01.742960+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:02.743285+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:03.743525+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:04.743677+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:05.743838+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:06.743961+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:07.744144+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:08.744387+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:09.744578+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:10.744809+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:11.744956+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:12.745120+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:13.745339+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:14.746025+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:15.746199+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:16.746359+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:17.746477+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:18.746610+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:19.746746+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:20.746935+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:21.747058+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:22.747193+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:23.747375+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:24.747533+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:25.747692+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:26.747904+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:27.748106+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:28.748257+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:29.748415+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:30.748664+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:31.748936+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:32.749139+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:33.749329+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:34.749528+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:35.749696+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:36.749872+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 25149440 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:37.750078+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:38.750261+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:39.750428+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:40.750603+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:41.750751+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:42.751047+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:43.751175+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:44.751378+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:45.751573+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:46.751728+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:47.751906+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:48.752092+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:49.752247+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:50.752499+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:51.752658+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:52.752797+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:53.752953+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:54.753102+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:55.753328+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:56.753514+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:57.753683+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:58.753840+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:59.754046+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:00.754264+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:01.754457+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:02.754649+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:03.754831+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:04.755034+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:05.758339+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:06.758594+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:07.758770+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:08.758956+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:09.759122+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:10.759462+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:11.759599+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:12.759725+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:13.759867+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:14.760018+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:15.760184+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:16.760344+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:17.760537+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:18.760719+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:19.761067+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:20.761379+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:21.761542+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 25141248 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:22.761672+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:23.761823+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:24.762224+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:25.762412+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:26.762537+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:27.762713+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:28.762847+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:29.763056+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:30.763250+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:31.763458+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:32.763637+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:33.763799+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:34.763963+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:35.764121+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:36.764282+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:37.764466+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:38.764658+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:39.764856+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:40.765090+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:41.765239+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:42.765378+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:43.765510+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:44.765706+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:45.765869+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:46.766072+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:47.766350+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:48.766548+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:49.767271+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:50.767599+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:51.767729+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:52.767946+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:53.768091+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:54.768268+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:55.768489+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:56.768653+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:57.768810+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:58.768970+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:59.769127+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:00.769489+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:01.769657+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:02.769840+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:03.769968+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:04.770120+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:05.770379+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:06.770628+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:07.770834+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:08.770999+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:09.771159+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:10.771392+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:11.772176+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:12.772983+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:13.773122+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:14.773351+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:15.773537+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:16.773731+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:17.773923+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:18.774090+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:19.774244+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:20.774462+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:21.774687+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:22.774895+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:23.775041+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:24.775211+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:25.775379+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:26.775576+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:27.775743+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:28.775978+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:29.776122+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:30.776332+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:31.776503+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:32.776657+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:33.776825+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 25133056 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:34.776962+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:35.777132+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:36.777344+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:37.777497+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:38.777713+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:39.777918+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:40.778150+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:41.778361+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:42.778541+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:43.778671+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:44.778845+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.2 total, 600.0 interval
                                           Cumulative writes: 6021 writes, 24K keys, 6021 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 6021 writes, 1127 syncs, 5.34 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1678 writes, 4735 keys, 1678 commit groups, 1.0 writes per commit group, ingest: 2.54 MB, 0.00 MB/s
                                           Interval WAL: 1678 writes, 729 syncs, 2.30 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:45.778993+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:46.779182+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:47.779401+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:48.779679+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:49.779860+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:50.780108+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:51.780357+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:52.780515+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:53.780714+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:54.780902+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:55.781070+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 25124864 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:56.781273+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: mgrc ms_handle_reset ms_handle_reset con 0x555f1a4edc00
Dec 01 09:48:05 compute-0 ceph-osd[89052]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/3312476512
Dec 01 09:48:05 compute-0 ceph-osd[89052]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/3312476512,v1:192.168.122.100:6801/3312476512]
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: get_auth_request con 0x555f1cb08800 auth_method 0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: mgrc handle_mgr_configure stats_period=5
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:57.781468+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 ms_handle_reset con 0x555f1b271000 session 0x555f19f23680
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b858000
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 ms_handle_reset con 0x555f1b3fe400 session 0x555f1c7ac1e0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b859400
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:58.781627+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 ms_handle_reset con 0x555f1b271c00 session 0x555f1aabb0e0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b3fe400
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:59.781839+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:00.782019+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:01.782235+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:02.782403+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:03.782600+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:04.782785+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:05.783054+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:06.783237+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:07.783440+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:08.783651+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:09.783829+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:10.784022+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:11.784250+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:12.784450+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:13.784598+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:14.784773+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:15.784912+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:16.785120+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:17.785394+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:18.785616+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:19.785813+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:20.786070+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:21.786233+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:22.786411+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:23.786546+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:24.786695+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:25.786842+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:26.786985+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:27.787149+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:28.787349+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:29.787512+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:30.787700+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:31.787850+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:32.787984+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:33.788103+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:34.788270+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:35.788509+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:36.788695+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:37.788918+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:38.789100+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:39.789328+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:40.789499+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:41.789641+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:42.789906+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:43.790072+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:44.790247+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:45.790416+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:46.790547+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:47.790851+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:48.791014+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:49.791266+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:50.791498+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:51.791656+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:52.791856+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:53.792124+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:54.792333+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:55.792546+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:56.802896+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:57.803100+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73097216 unmapped: 24936448 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:58.803375+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73105408 unmapped: 24928256 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:59.803601+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73105408 unmapped: 24928256 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:00.803868+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73105408 unmapped: 24928256 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 ms_handle_reset con 0x555f1b858400 session 0x555f1b29dc20
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: handle_auth_request added challenge on 0x555f1b8eb400
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:01.804041+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:02.804221+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:03.804388+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:04.804819+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:05.805013+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:06.805192+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:07.805488+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:08.805708+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:09.805991+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:10.806188+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:11.806383+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:12.806573+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:13.806730+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:14.806865+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:15.807081+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:16.807354+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:17.807569+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:18.807743+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:19.807886+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:20.808084+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:21.808267+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:22.808474+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:23.808675+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:24.808836+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:25.809068+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:26.809260+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:27.809593+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:28.809749+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:29.809902+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:30.810136+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 24920064 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:31.810348+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:32.810539+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:33.810690+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:34.810826+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:35.811011+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:36.811238+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:37.811444+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:38.811705+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:39.811909+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:40.812130+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:41.812309+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:42.812482+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:43.812652+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:44.812870+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:45.813033+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:46.813200+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:47.813393+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:48.813600+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:49.813758+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:50.813979+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:51.814160+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:52.814343+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:53.814529+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:54.814671+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:55.814854+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:56.815048+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:57.815212+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:58.815347+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:59.815531+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:00.815940+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:01.816350+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:02.817154+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:03.817591+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:04.818385+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:05.818784+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:06.818954+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:07.819540+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 24911872 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:08.819731+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73129984 unmapped: 24903680 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:09.820220+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73129984 unmapped: 24903680 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:10.820742+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73129984 unmapped: 24903680 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:11.821004+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73129984 unmapped: 24903680 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:12.821394+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73129984 unmapped: 24903680 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:13.821747+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73129984 unmapped: 24903680 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:14.822072+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73129984 unmapped: 24903680 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:15.822393+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73129984 unmapped: 24903680 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:16.822758+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73129984 unmapped: 24903680 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:17.823000+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73129984 unmapped: 24903680 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:18.823187+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73129984 unmapped: 24903680 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:19.823347+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73129984 unmapped: 24903680 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:20.823680+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73129984 unmapped: 24903680 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:21.823883+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73129984 unmapped: 24903680 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:22.824186+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73129984 unmapped: 24903680 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:23.824442+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73129984 unmapped: 24903680 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:24.824848+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73129984 unmapped: 24903680 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:25.825035+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73129984 unmapped: 24903680 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:26.825275+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73129984 unmapped: 24903680 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:27.825564+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73129984 unmapped: 24903680 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:28.825791+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73129984 unmapped: 24903680 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:29.826035+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73129984 unmapped: 24903680 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:30.826332+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73129984 unmapped: 24903680 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:31.826550+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73129984 unmapped: 24903680 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:32.826725+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73129984 unmapped: 24903680 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:33.826934+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73129984 unmapped: 24903680 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:34.827124+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73129984 unmapped: 24903680 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:35.827320+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73129984 unmapped: 24903680 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:36.827542+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73129984 unmapped: 24903680 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:37.827719+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73129984 unmapped: 24903680 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:38.827968+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 24895488 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:39.828189+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 24895488 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:40.828573+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 24895488 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:41.828737+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 24895488 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:42.828896+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 24895488 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:43.829092+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 24895488 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:44.829261+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 24895488 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:45.829458+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 24895488 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:46.829632+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 24895488 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:47.829875+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 24895488 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:48.830073+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 24895488 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:49.830214+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 24895488 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:50.830398+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 24895488 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:51.830548+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 24895488 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:52.830746+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 24895488 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:53.830944+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 24895488 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:54.831128+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 24895488 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:55.831268+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 24895488 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:56.831418+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 24895488 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:57.831570+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 24895488 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:58.831736+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 24895488 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:59.831871+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 24895488 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:00.832088+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 24895488 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:01.832482+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 24895488 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:02.832693+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 24895488 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:03.832906+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 24895488 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:04.833109+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 24895488 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:05.833350+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 24895488 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:06.833536+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 24895488 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:07.834043+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 24895488 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:08.834464+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73146368 unmapped: 24887296 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:09.834860+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73146368 unmapped: 24887296 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:10.835319+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73146368 unmapped: 24887296 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:11.835626+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73146368 unmapped: 24887296 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:12.835795+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73146368 unmapped: 24887296 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:13.835954+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73146368 unmapped: 24887296 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:14.836378+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73146368 unmapped: 24887296 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:15.836751+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73146368 unmapped: 24887296 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:16.836885+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73146368 unmapped: 24887296 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:17.837159+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73146368 unmapped: 24887296 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:18.837385+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73146368 unmapped: 24887296 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:19.837698+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73146368 unmapped: 24887296 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:20.837952+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73146368 unmapped: 24887296 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:21.838097+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73146368 unmapped: 24887296 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:22.838379+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73146368 unmapped: 24887296 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:23.838516+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73146368 unmapped: 24887296 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:24.838643+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73146368 unmapped: 24887296 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:25.838782+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73146368 unmapped: 24887296 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:26.838920+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73146368 unmapped: 24887296 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:27.839065+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73146368 unmapped: 24887296 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:28.839248+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73146368 unmapped: 24887296 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:29.839374+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73146368 unmapped: 24887296 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:30.839526+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73146368 unmapped: 24887296 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:31.839660+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: do_command 'config diff' '{prefix=config diff}'
Dec 01 09:48:05 compute-0 ceph-osd[89052]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec 01 09:48:05 compute-0 ceph-osd[89052]: do_command 'config show' '{prefix=config show}'
Dec 01 09:48:05 compute-0 ceph-osd[89052]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73342976 unmapped: 24690688 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:32.839849+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: do_command 'counter dump' '{prefix=counter dump}'
Dec 01 09:48:05 compute-0 ceph-osd[89052]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec 01 09:48:05 compute-0 ceph-osd[89052]: do_command 'counter schema' '{prefix=counter schema}'
Dec 01 09:48:05 compute-0 ceph-osd[89052]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 24567808 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:33.840077+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:05 compute-0 ceph-osd[89052]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:05 compute-0 ceph-osd[89052]: bluestore.MempoolThread(0x555f19227b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 782826 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:05 compute-0 ceph-osd[89052]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 24535040 heap: 98033664 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: tick
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_tickets
Dec 01 09:48:05 compute-0 ceph-osd[89052]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:34.840268+0000)
Dec 01 09:48:05 compute-0 ceph-osd[89052]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fb2b0000/0x0/0x4ffc00000, data 0x18d9035/0x196d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,2] op hist [])
Dec 01 09:48:05 compute-0 ceph-osd[89052]: do_command 'log dump' '{prefix=log dump}'
Dec 01 09:48:05 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0) v1
Dec 01 09:48:05 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4163474659' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Dec 01 09:48:05 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata"} v 0) v1
Dec 01 09:48:05 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2242847727' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec 01 09:48:05 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:48:05 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1127: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:48:05 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0) v1
Dec 01 09:48:05 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3150262511' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Dec 01 09:48:06 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd utilization"} v 0) v1
Dec 01 09:48:06 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/25539967' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Dec 01 09:48:06 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/840775449' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Dec 01 09:48:06 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/4163474659' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Dec 01 09:48:06 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2242847727' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec 01 09:48:06 compute-0 ceph-mon[75031]: pgmap v1127: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:48:06 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3150262511' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Dec 01 09:48:06 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/25539967' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Dec 01 09:48:06 compute-0 rsyslogd[1007]: imjournal from <np0005540741:ceph-osd>: begin to drop messages due to rate-limiting
Dec 01 09:48:06 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15134 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:48:06 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15136 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:48:06 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15138 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:48:07 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15140 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:48:07 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15142 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:48:07 compute-0 ceph-mon[75031]: from='client.15134 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:48:07 compute-0 ceph-mon[75031]: from='client.15136 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:48:07 compute-0 ceph-mon[75031]: from='client.15138 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:48:07 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15146 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:48:07 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15150 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:48:07 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "quorum_status"} v 0) v1
Dec 01 09:48:07 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1795872441' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Dec 01 09:48:07 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1128: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:48:07 compute-0 podman[279090]: 2025-12-01 09:48:07.981810827 +0000 UTC m=+0.075556082 container health_status 832582bc25aebe04ca9e0343b5a2b7afbca1792fe2a7c8967f2585969c8f643d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 01 09:48:08 compute-0 ceph-mon[75031]: from='client.15140 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:48:08 compute-0 ceph-mon[75031]: from='client.15142 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:48:08 compute-0 ceph-mon[75031]: from='client.15146 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:48:08 compute-0 ceph-mon[75031]: from='client.15150 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:48:08 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1795872441' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Dec 01 09:48:08 compute-0 ceph-mon[75031]: pgmap v1128: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:48:08 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15153 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:48:08 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "versions"} v 0) v1
Dec 01 09:48:08 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/982398669' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Dec 01 09:48:08 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15156 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:48:08 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0) v1
Dec 01 09:48:08 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/466526555' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 01 09:48:09 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0) v1
Dec 01 09:48:09 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/833397885' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Dec 01 09:48:09 compute-0 ceph-mon[75031]: from='client.15153 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:48:09 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/982398669' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Dec 01 09:48:09 compute-0 ceph-mon[75031]: from='client.15156 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 09:48:09 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/466526555' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 01 09:48:09 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/833397885' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Dec 01 09:48:09 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 01 09:48:09 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 01 09:48:09 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config dump"} v 0) v1
Dec 01 09:48:09 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2133580005' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Dec 01 09:48:09 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1129: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58236928 unmapped: 1474560 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:49.805396+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.1b scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 6.1b scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58245120 unmapped: 1466368 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:50.805521+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 53 sent 51 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:20.546636+0000 osd.0 (osd.0) 52 : cluster [DBG] 6.1b scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:20.560699+0000 osd.0 (osd.0) 53 : cluster [DBG] 6.1b scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 368225 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.1e scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.1e scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 53) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:20.546636+0000 osd.0 (osd.0) 52 : cluster [DBG] 6.1b scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:20.560699+0000 osd.0 (osd.0) 53 : cluster [DBG] 6.1b scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58245120 unmapped: 1466368 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:51.805765+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 55 sent 53 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:21.552360+0000 osd.0 (osd.0) 54 : cluster [DBG] 5.1e scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:21.566317+0000 osd.0 (osd.0) 55 : cluster [DBG] 5.1e scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.19 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.19 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 55) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:21.552360+0000 osd.0 (osd.0) 54 : cluster [DBG] 5.1e scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:21.566317+0000 osd.0 (osd.0) 55 : cluster [DBG] 5.1e scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58236928 unmapped: 1474560 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:52.806005+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 57 sent 55 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:22.548757+0000 osd.0 (osd.0) 56 : cluster [DBG] 2.19 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:22.562790+0000 osd.0 (osd.0) 57 : cluster [DBG] 2.19 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 57) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:22.548757+0000 osd.0 (osd.0) 56 : cluster [DBG] 2.19 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:22.562790+0000 osd.0 (osd.0) 57 : cluster [DBG] 2.19 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58236928 unmapped: 1474560 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:53.806249+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58245120 unmapped: 1466368 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:54.806550+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58245120 unmapped: 1466368 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:55.806746+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 370521 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.18 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.624811172s of 10.919664383s, submitted: 10
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.18 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58261504 unmapped: 1449984 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:56.806923+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 59 sent 57 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:26.529908+0000 osd.0 (osd.0) 58 : cluster [DBG] 2.18 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:26.544023+0000 osd.0 (osd.0) 59 : cluster [DBG] 2.18 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.16 deep-scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.16 deep-scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:57.807207+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 4 last_log 61 sent 59 num 4 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:27.530553+0000 osd.0 (osd.0) 60 : cluster [DBG] 2.16 deep-scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:27.544552+0000 osd.0 (osd.0) 61 : cluster [DBG] 2.16 deep-scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58294272 unmapped: 1417216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 59) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:26.529908+0000 osd.0 (osd.0) 58 : cluster [DBG] 2.18 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:26.544023+0000 osd.0 (osd.0) 59 : cluster [DBG] 2.18 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.13 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.13 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:58.807417+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 4 last_log 63 sent 61 num 4 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:28.509925+0000 osd.0 (osd.0) 62 : cluster [DBG] 2.13 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:28.524060+0000 osd.0 (osd.0) 63 : cluster [DBG] 2.13 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58302464 unmapped: 1409024 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 61) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:27.530553+0000 osd.0 (osd.0) 60 : cluster [DBG] 2.16 deep-scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:27.544552+0000 osd.0 (osd.0) 61 : cluster [DBG] 2.16 deep-scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 63) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:28.509925+0000 osd.0 (osd.0) 62 : cluster [DBG] 2.13 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:28.524060+0000 osd.0 (osd.0) 63 : cluster [DBG] 2.13 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:15:59.807719+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58310656 unmapped: 1400832 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:00.807889+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 373965 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58310656 unmapped: 1400832 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:01.808093+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58318848 unmapped: 1392640 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.14 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.14 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:02.808273+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 65 sent 63 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:32.470612+0000 osd.0 (osd.0) 64 : cluster [DBG] 5.14 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:32.484512+0000 osd.0 (osd.0) 65 : cluster [DBG] 5.14 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.15 deep-scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58318848 unmapped: 1392640 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.15 deep-scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 65) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:32.470612+0000 osd.0 (osd.0) 64 : cluster [DBG] 5.14 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:32.484512+0000 osd.0 (osd.0) 65 : cluster [DBG] 5.14 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:03.808549+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 67 sent 65 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:33.464782+0000 osd.0 (osd.0) 66 : cluster [DBG] 5.15 deep-scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:33.478737+0000 osd.0 (osd.0) 67 : cluster [DBG] 5.15 deep-scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58327040 unmapped: 1384448 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 67) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:33.464782+0000 osd.0 (osd.0) 66 : cluster [DBG] 5.15 deep-scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:33.478737+0000 osd.0 (osd.0) 67 : cluster [DBG] 5.15 deep-scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:04.808767+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.f scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.f scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58368000 unmapped: 1343488 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:05.808920+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 69 sent 67 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:35.379257+0000 osd.0 (osd.0) 68 : cluster [DBG] 2.f scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:35.393099+0000 osd.0 (osd.0) 69 : cluster [DBG] 2.f scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 377408 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58368000 unmapped: 1343488 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 69) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:35.379257+0000 osd.0 (osd.0) 68 : cluster [DBG] 2.f scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:35.393099+0000 osd.0 (osd.0) 69 : cluster [DBG] 2.f scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:06.809150+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58376192 unmapped: 1335296 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:07.809323+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.7 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.823574066s of 11.904456139s, submitted: 12
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.7 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58376192 unmapped: 1335296 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:08.809476+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 71 sent 69 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:38.434349+0000 osd.0 (osd.0) 70 : cluster [DBG] 5.7 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:38.448558+0000 osd.0 (osd.0) 71 : cluster [DBG] 5.7 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.4 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.4 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58384384 unmapped: 1327104 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 71) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:38.434349+0000 osd.0 (osd.0) 70 : cluster [DBG] 5.7 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:38.448558+0000 osd.0 (osd.0) 71 : cluster [DBG] 5.7 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:09.809734+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 73 sent 71 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:39.408865+0000 osd.0 (osd.0) 72 : cluster [DBG] 5.4 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:39.422853+0000 osd.0 (osd.0) 73 : cluster [DBG] 5.4 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58392576 unmapped: 1318912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 73) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:39.408865+0000 osd.0 (osd.0) 72 : cluster [DBG] 5.4 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:39.422853+0000 osd.0 (osd.0) 73 : cluster [DBG] 5.4 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:10.809978+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 379702 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.11 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.11 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58400768 unmapped: 1310720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:11.810164+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 75 sent 73 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:41.455252+0000 osd.0 (osd.0) 74 : cluster [DBG] 2.11 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:41.469578+0000 osd.0 (osd.0) 75 : cluster [DBG] 2.11 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58400768 unmapped: 1310720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 75) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:41.455252+0000 osd.0 (osd.0) 74 : cluster [DBG] 2.11 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:41.469578+0000 osd.0 (osd.0) 75 : cluster [DBG] 2.11 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:12.810391+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.3 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.3 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58408960 unmapped: 1302528 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:13.810581+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 77 sent 75 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:43.390779+0000 osd.0 (osd.0) 76 : cluster [DBG] 5.3 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:43.404338+0000 osd.0 (osd.0) 77 : cluster [DBG] 5.3 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58408960 unmapped: 1302528 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 77) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:43.390779+0000 osd.0 (osd.0) 76 : cluster [DBG] 5.3 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:43.404338+0000 osd.0 (osd.0) 77 : cluster [DBG] 5.3 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:14.811112+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.8 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.8 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58425344 unmapped: 1286144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:15.811307+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 79 sent 77 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:45.367464+0000 osd.0 (osd.0) 78 : cluster [DBG] 2.8 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:45.381502+0000 osd.0 (osd.0) 79 : cluster [DBG] 2.8 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 383144 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58425344 unmapped: 1286144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 79) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:45.367464+0000 osd.0 (osd.0) 78 : cluster [DBG] 2.8 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:45.381502+0000 osd.0 (osd.0) 79 : cluster [DBG] 2.8 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:16.811530+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58433536 unmapped: 1277952 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:17.811702+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58433536 unmapped: 1277952 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:18.811864+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.5 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.929822922s of 10.976409912s, submitted: 10
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.5 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58441728 unmapped: 1269760 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:19.812024+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 81 sent 79 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:49.410835+0000 osd.0 (osd.0) 80 : cluster [DBG] 5.5 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:49.424838+0000 osd.0 (osd.0) 81 : cluster [DBG] 5.5 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 81) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:49.410835+0000 osd.0 (osd.0) 80 : cluster [DBG] 5.5 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:49.424838+0000 osd.0 (osd.0) 81 : cluster [DBG] 5.5 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58449920 unmapped: 1261568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:20.812331+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 384291 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.b scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.b scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58466304 unmapped: 1245184 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:21.812586+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 83 sent 81 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:51.399596+0000 osd.0 (osd.0) 82 : cluster [DBG] 2.b scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:51.413727+0000 osd.0 (osd.0) 83 : cluster [DBG] 2.b scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 83) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:51.399596+0000 osd.0 (osd.0) 82 : cluster [DBG] 2.b scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:51.413727+0000 osd.0 (osd.0) 83 : cluster [DBG] 2.b scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58466304 unmapped: 1245184 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:22.812793+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58474496 unmapped: 1236992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:23.812962+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.2 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 5.2 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58474496 unmapped: 1236992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:24.813147+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 85 sent 83 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:54.452981+0000 osd.0 (osd.0) 84 : cluster [DBG] 5.2 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:54.467182+0000 osd.0 (osd.0) 85 : cluster [DBG] 5.2 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 85) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:54.452981+0000 osd.0 (osd.0) 84 : cluster [DBG] 5.2 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:54.467182+0000 osd.0 (osd.0) 85 : cluster [DBG] 5.2 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58507264 unmapped: 1204224 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:25.813448+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 386585 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58507264 unmapped: 1204224 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:26.813624+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.1d scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.1d scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58515456 unmapped: 1196032 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:27.813808+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 87 sent 85 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:57.399427+0000 osd.0 (osd.0) 86 : cluster [DBG] 2.1d scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:57.413428+0000 osd.0 (osd.0) 87 : cluster [DBG] 2.1d scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 87) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:57.399427+0000 osd.0 (osd.0) 86 : cluster [DBG] 2.1d scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:57.413428+0000 osd.0 (osd.0) 87 : cluster [DBG] 2.1d scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58515456 unmapped: 1196032 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:28.814014+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.1f scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.1f scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58523648 unmapped: 1187840 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:29.814160+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 89 sent 87 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:59.374774+0000 osd.0 (osd.0) 88 : cluster [DBG] 2.1f scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:16:59.388931+0000 osd.0 (osd.0) 89 : cluster [DBG] 2.1f scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58540032 unmapped: 1171456 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 89) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:59.374774+0000 osd.0 (osd.0) 88 : cluster [DBG] 2.1f scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:16:59.388931+0000 osd.0 (osd.0) 89 : cluster [DBG] 2.1f scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:30.814481+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 388881 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58540032 unmapped: 1171456 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:31.814665+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.1c scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.916635513s of 12.950966835s, submitted: 10
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 2.1c scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58548224 unmapped: 1163264 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:32.814881+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 91 sent 89 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:02.361938+0000 osd.0 (osd.0) 90 : cluster [DBG] 2.1c scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:02.375966+0000 osd.0 (osd.0) 91 : cluster [DBG] 2.1c scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58548224 unmapped: 1163264 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 91) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:02.361938+0000 osd.0 (osd.0) 90 : cluster [DBG] 2.1c scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:02.375966+0000 osd.0 (osd.0) 91 : cluster [DBG] 2.1c scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:33.815142+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58556416 unmapped: 1155072 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:34.815368+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58556416 unmapped: 1155072 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:35.815538+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 390029 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58556416 unmapped: 1155072 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:36.815695+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.17 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.17 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58580992 unmapped: 1130496 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:37.815857+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 93 sent 91 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:07.374384+0000 osd.0 (osd.0) 92 : cluster [DBG] 3.17 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:07.388451+0000 osd.0 (osd.0) 93 : cluster [DBG] 3.17 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 93) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:07.374384+0000 osd.0 (osd.0) 92 : cluster [DBG] 3.17 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:07.388451+0000 osd.0 (osd.0) 93 : cluster [DBG] 3.17 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.13 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.13 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58589184 unmapped: 1122304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:38.816085+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 95 sent 93 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:08.349892+0000 osd.0 (osd.0) 94 : cluster [DBG] 7.13 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:08.363839+0000 osd.0 (osd.0) 95 : cluster [DBG] 7.13 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 95) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:08.349892+0000 osd.0 (osd.0) 94 : cluster [DBG] 7.13 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:08.363839+0000 osd.0 (osd.0) 95 : cluster [DBG] 7.13 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58589184 unmapped: 1122304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:39.816356+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58597376 unmapped: 1114112 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:40.816535+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 392325 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58597376 unmapped: 1114112 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:41.816716+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.15 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.15 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58605568 unmapped: 1105920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:42.816875+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 97 sent 95 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:12.305964+0000 osd.0 (osd.0) 96 : cluster [DBG] 3.15 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:12.319947+0000 osd.0 (osd.0) 97 : cluster [DBG] 3.15 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 97) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:12.305964+0000 osd.0 (osd.0) 96 : cluster [DBG] 3.15 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:12.319947+0000 osd.0 (osd.0) 97 : cluster [DBG] 3.15 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58613760 unmapped: 1097728 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:43.817149+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58613760 unmapped: 1097728 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:44.817385+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58638336 unmapped: 1073152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:45.817539+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 393473 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58638336 unmapped: 1073152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:46.817689+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58646528 unmapped: 1064960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:47.817854+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.12 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.937739372s of 15.965059280s, submitted: 8
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.12 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58646528 unmapped: 1064960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:48.818019+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 99 sent 97 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:18.326960+0000 osd.0 (osd.0) 98 : cluster [DBG] 3.12 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:18.341083+0000 osd.0 (osd.0) 99 : cluster [DBG] 3.12 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 99) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:18.326960+0000 osd.0 (osd.0) 98 : cluster [DBG] 3.12 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:18.341083+0000 osd.0 (osd.0) 99 : cluster [DBG] 3.12 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.f scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.f scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58662912 unmapped: 1048576 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:49.818242+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 101 sent 99 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:19.336348+0000 osd.0 (osd.0) 100 : cluster [DBG] 3.f scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:19.350308+0000 osd.0 (osd.0) 101 : cluster [DBG] 3.f scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 101) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:19.336348+0000 osd.0 (osd.0) 100 : cluster [DBG] 3.f scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:19.350308+0000 osd.0 (osd.0) 101 : cluster [DBG] 3.f scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58662912 unmapped: 1048576 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:50.818679+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 395768 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58671104 unmapped: 1040384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:51.818896+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58671104 unmapped: 1040384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:52.819054+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.9 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.9 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58671104 unmapped: 1040384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:53.819237+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 103 sent 101 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:23.269210+0000 osd.0 (osd.0) 102 : cluster [DBG] 7.9 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:23.283279+0000 osd.0 (osd.0) 103 : cluster [DBG] 7.9 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 103) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:23.269210+0000 osd.0 (osd.0) 102 : cluster [DBG] 7.9 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:23.283279+0000 osd.0 (osd.0) 103 : cluster [DBG] 7.9 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.c deep-scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.c deep-scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58703872 unmapped: 1007616 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:54.819832+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 105 sent 103 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:24.296815+0000 osd.0 (osd.0) 104 : cluster [DBG] 3.c deep-scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:24.310896+0000 osd.0 (osd.0) 105 : cluster [DBG] 3.c deep-scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 105) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:24.296815+0000 osd.0 (osd.0) 104 : cluster [DBG] 3.c deep-scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:24.310896+0000 osd.0 (osd.0) 105 : cluster [DBG] 3.c deep-scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58703872 unmapped: 1007616 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:55.820150+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 398062 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58720256 unmapped: 991232 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:56.820311+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.f scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.f scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58720256 unmapped: 991232 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:57.820462+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 107 sent 105 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:27.332240+0000 osd.0 (osd.0) 106 : cluster [DBG] 7.f scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:27.346334+0000 osd.0 (osd.0) 107 : cluster [DBG] 7.f scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 107) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:27.332240+0000 osd.0 (osd.0) 106 : cluster [DBG] 7.f scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:27.346334+0000 osd.0 (osd.0) 107 : cluster [DBG] 7.f scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58720256 unmapped: 991232 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:58.820687+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.6 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.009760857s of 11.044724464s, submitted: 10
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.6 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58736640 unmapped: 974848 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:16:59.820871+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 109 sent 107 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:29.371822+0000 osd.0 (osd.0) 108 : cluster [DBG] 7.6 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:29.385727+0000 osd.0 (osd.0) 109 : cluster [DBG] 7.6 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 109) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:29.371822+0000 osd.0 (osd.0) 108 : cluster [DBG] 7.6 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:29.385727+0000 osd.0 (osd.0) 109 : cluster [DBG] 7.6 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58736640 unmapped: 974848 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:00.821181+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 400356 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58744832 unmapped: 966656 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:01.821359+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.6 deep-scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.6 deep-scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58761216 unmapped: 950272 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:02.821507+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 111 sent 109 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:32.408675+0000 osd.0 (osd.0) 110 : cluster [DBG] 3.6 deep-scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:32.422674+0000 osd.0 (osd.0) 111 : cluster [DBG] 3.6 deep-scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 111) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:32.408675+0000 osd.0 (osd.0) 110 : cluster [DBG] 3.6 deep-scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:32.422674+0000 osd.0 (osd.0) 111 : cluster [DBG] 3.6 deep-scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.3 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.3 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58761216 unmapped: 950272 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:03.821723+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 113 sent 111 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:33.415355+0000 osd.0 (osd.0) 112 : cluster [DBG] 7.3 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:33.429358+0000 osd.0 (osd.0) 113 : cluster [DBG] 7.3 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15168 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 113) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:33.415355+0000 osd.0 (osd.0) 112 : cluster [DBG] 7.3 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:33.429358+0000 osd.0 (osd.0) 113 : cluster [DBG] 7.3 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.9 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.9 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58802176 unmapped: 909312 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:04.822116+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 115 sent 113 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:34.387045+0000 osd.0 (osd.0) 114 : cluster [DBG] 3.9 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:34.401217+0000 osd.0 (osd.0) 115 : cluster [DBG] 3.9 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 115) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:34.387045+0000 osd.0 (osd.0) 114 : cluster [DBG] 3.9 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:34.401217+0000 osd.0 (osd.0) 115 : cluster [DBG] 3.9 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.a scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.a scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58818560 unmapped: 892928 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:05.822362+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 117 sent 115 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:35.414919+0000 osd.0 (osd.0) 116 : cluster [DBG] 3.a scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:35.428947+0000 osd.0 (osd.0) 117 : cluster [DBG] 3.a scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 117) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:35.414919+0000 osd.0 (osd.0) 116 : cluster [DBG] 3.a scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:35.428947+0000 osd.0 (osd.0) 117 : cluster [DBG] 3.a scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 404944 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58826752 unmapped: 884736 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:06.822623+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.4 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.4 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58826752 unmapped: 884736 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:07.822823+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 119 sent 117 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:37.457522+0000 osd.0 (osd.0) 118 : cluster [DBG] 7.4 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:37.472254+0000 osd.0 (osd.0) 119 : cluster [DBG] 7.4 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 119) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:37.457522+0000 osd.0 (osd.0) 118 : cluster [DBG] 7.4 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:37.472254+0000 osd.0 (osd.0) 119 : cluster [DBG] 7.4 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58826752 unmapped: 884736 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:08.823354+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58843136 unmapped: 868352 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:09.823571+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58843136 unmapped: 868352 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:10.823772+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 406091 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58851328 unmapped: 860160 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:11.823935+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58851328 unmapped: 860160 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:12.824099+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58851328 unmapped: 860160 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:13.824283+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.1b scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.002301216s of 15.046391487s, submitted: 12
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.1b scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58859520 unmapped: 851968 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:14.824517+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 121 sent 119 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:44.418125+0000 osd.0 (osd.0) 120 : cluster [DBG] 3.1b scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:44.432323+0000 osd.0 (osd.0) 121 : cluster [DBG] 3.1b scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 121) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:44.418125+0000 osd.0 (osd.0) 120 : cluster [DBG] 3.1b scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:44.432323+0000 osd.0 (osd.0) 121 : cluster [DBG] 3.1b scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.18 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.18 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58859520 unmapped: 851968 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:15.824775+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 123 sent 121 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:45.428356+0000 osd.0 (osd.0) 122 : cluster [DBG] 7.18 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:45.442452+0000 osd.0 (osd.0) 123 : cluster [DBG] 7.18 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 408387 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 123) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:45.428356+0000 osd.0 (osd.0) 122 : cluster [DBG] 7.18 scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:45.442452+0000 osd.0 (osd.0) 123 : cluster [DBG] 7.18 scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.1f scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.1f scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58867712 unmapped: 843776 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:16.825143+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 125 sent 123 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:46.403953+0000 osd.0 (osd.0) 124 : cluster [DBG] 7.1f scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:46.418078+0000 osd.0 (osd.0) 125 : cluster [DBG] 7.1f scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 125) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:46.403953+0000 osd.0 (osd.0) 124 : cluster [DBG] 7.1f scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:46.418078+0000 osd.0 (osd.0) 125 : cluster [DBG] 7.1f scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.1f scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 3.1f scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58884096 unmapped: 827392 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:17.825531+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 127 sent 125 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:47.388794+0000 osd.0 (osd.0) 126 : cluster [DBG] 3.1f scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:47.402875+0000 osd.0 (osd.0) 127 : cluster [DBG] 3.1f scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 127) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:47.388794+0000 osd.0 (osd.0) 126 : cluster [DBG] 3.1f scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:47.402875+0000 osd.0 (osd.0) 127 : cluster [DBG] 3.1f scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58892288 unmapped: 819200 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:18.825848+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.1b scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_channel(cluster) log [DBG] : 7.1b scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58908672 unmapped: 802816 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:19.825989+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  log_queue is 2 last_log 129 sent 127 num 2 unsent 2 sending 2
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:49.384350+0000 osd.0 (osd.0) 128 : cluster [DBG] 7.1b scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  will send 2025-12-01T09:17:49.398826+0000 osd.0 (osd.0) 129 : cluster [DBG] 7.1b scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client handle_log_ack log(last 129) v1
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:49.384350+0000 osd.0 (osd.0) 128 : cluster [DBG] 7.1b scrub starts
Dec 01 09:48:10 compute-0 ceph-osd[88047]: log_client  logged 2025-12-01T09:17:49.398826+0000 osd.0 (osd.0) 129 : cluster [DBG] 7.1b scrub ok
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58916864 unmapped: 794624 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:20.826188+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58925056 unmapped: 786432 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:21.826335+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58925056 unmapped: 786432 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:22.826476+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58933248 unmapped: 778240 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:23.826596+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58933248 unmapped: 778240 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:24.826781+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58957824 unmapped: 753664 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:25.827085+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58966016 unmapped: 745472 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:26.827539+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58966016 unmapped: 745472 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:27.827759+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58974208 unmapped: 737280 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:28.828149+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58974208 unmapped: 737280 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:29.828310+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58974208 unmapped: 737280 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:30.828452+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58982400 unmapped: 729088 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:31.828654+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58982400 unmapped: 729088 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:32.829193+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58990592 unmapped: 720896 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:33.829771+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58998784 unmapped: 712704 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:34.830129+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 58998784 unmapped: 712704 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:35.830466+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59006976 unmapped: 704512 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:36.831076+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59015168 unmapped: 696320 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:37.831505+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59023360 unmapped: 688128 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:38.831871+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59023360 unmapped: 688128 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:39.832089+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59031552 unmapped: 679936 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:40.832519+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:41.832789+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59031552 unmapped: 679936 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:42.833208+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59031552 unmapped: 679936 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:43.833439+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59047936 unmapped: 663552 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:44.833729+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59047936 unmapped: 663552 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:45.834473+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59064320 unmapped: 647168 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:46.834946+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59072512 unmapped: 638976 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:47.835371+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59072512 unmapped: 638976 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:48.835884+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59080704 unmapped: 630784 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:49.836348+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59080704 unmapped: 630784 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:50.836558+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59088896 unmapped: 622592 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:51.836764+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59088896 unmapped: 622592 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:52.836936+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59097088 unmapped: 614400 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:53.837495+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59097088 unmapped: 614400 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:54.837805+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59097088 unmapped: 614400 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:55.837953+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59105280 unmapped: 606208 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:56.838251+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59105280 unmapped: 606208 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:57.838411+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59113472 unmapped: 598016 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:58.838750+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59113472 unmapped: 598016 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:17:59.839032+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59113472 unmapped: 598016 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:00.839275+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59129856 unmapped: 581632 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:01.839606+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59129856 unmapped: 581632 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:02.839835+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59138048 unmapped: 573440 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:03.840047+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59138048 unmapped: 573440 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:04.840266+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59146240 unmapped: 565248 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:05.840585+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59146240 unmapped: 565248 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:06.840741+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59154432 unmapped: 557056 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:07.841056+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59162624 unmapped: 548864 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:08.841405+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59162624 unmapped: 548864 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:09.841670+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59170816 unmapped: 540672 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:10.841929+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59170816 unmapped: 540672 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:11.842134+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59179008 unmapped: 532480 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:12.842273+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59179008 unmapped: 532480 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:13.842453+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59179008 unmapped: 532480 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:14.842712+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59187200 unmapped: 524288 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:15.842918+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59179008 unmapped: 532480 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:16.843069+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59179008 unmapped: 532480 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:17.843218+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59187200 unmapped: 524288 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:18.843445+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59187200 unmapped: 524288 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:19.843687+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59195392 unmapped: 516096 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:20.843851+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59195392 unmapped: 516096 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:21.844007+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59203584 unmapped: 507904 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:22.844178+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59203584 unmapped: 507904 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:23.844373+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59203584 unmapped: 507904 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:24.844554+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59219968 unmapped: 491520 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:25.844806+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59219968 unmapped: 491520 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:26.844985+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59228160 unmapped: 483328 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:27.845165+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59228160 unmapped: 483328 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:28.845416+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59228160 unmapped: 483328 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:29.845567+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59236352 unmapped: 475136 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:30.845720+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59236352 unmapped: 475136 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:31.845893+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59244544 unmapped: 466944 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:32.846071+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59244544 unmapped: 466944 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:33.846210+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59252736 unmapped: 458752 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:34.846450+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59252736 unmapped: 458752 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:35.846644+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59260928 unmapped: 450560 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:36.846994+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59277312 unmapped: 434176 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:37.847179+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59277312 unmapped: 434176 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:38.847408+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59285504 unmapped: 425984 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:39.847585+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59285504 unmapped: 425984 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:40.847934+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59301888 unmapped: 409600 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:41.848102+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59301888 unmapped: 409600 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:42.848348+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59301888 unmapped: 409600 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:43.848592+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59310080 unmapped: 401408 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:44.848802+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59310080 unmapped: 401408 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:45.848971+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59310080 unmapped: 401408 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:46.849166+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:47.849407+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59318272 unmapped: 393216 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:48.849710+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59326464 unmapped: 385024 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:49.849839+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59326464 unmapped: 385024 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:50.850006+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59334656 unmapped: 376832 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:51.850171+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59342848 unmapped: 368640 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:52.850335+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59342848 unmapped: 368640 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:53.850570+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59351040 unmapped: 360448 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:54.850796+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59351040 unmapped: 360448 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:55.851007+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59359232 unmapped: 352256 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:56.851240+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59359232 unmapped: 352256 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:57.851484+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59359232 unmapped: 352256 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:58.851658+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59367424 unmapped: 344064 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:18:59.851814+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59367424 unmapped: 344064 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:00.852007+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59367424 unmapped: 344064 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:01.852208+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59375616 unmapped: 335872 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:02.852415+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59375616 unmapped: 335872 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:03.852563+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59383808 unmapped: 327680 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:04.852770+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59392000 unmapped: 319488 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:05.852922+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59392000 unmapped: 319488 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:06.853136+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59400192 unmapped: 311296 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:07.853363+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59400192 unmapped: 311296 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:08.853615+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59408384 unmapped: 303104 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:09.853777+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59408384 unmapped: 303104 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:10.853992+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59416576 unmapped: 294912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:11.854176+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59416576 unmapped: 294912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:12.854361+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59416576 unmapped: 294912 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:13.854561+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59424768 unmapped: 286720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:14.854771+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59424768 unmapped: 286720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:15.854906+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59424768 unmapped: 286720 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:16.855059+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59432960 unmapped: 278528 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:17.855217+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59432960 unmapped: 278528 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:18.855388+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59441152 unmapped: 270336 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:19.855561+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59441152 unmapped: 270336 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:20.855728+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59441152 unmapped: 270336 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:21.855880+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59449344 unmapped: 262144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:22.856074+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59449344 unmapped: 262144 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:23.856380+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59457536 unmapped: 253952 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:24.857126+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59457536 unmapped: 253952 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:25.857284+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59457536 unmapped: 253952 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:26.857473+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59465728 unmapped: 245760 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:27.857646+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59465728 unmapped: 245760 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:28.857814+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59465728 unmapped: 245760 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:29.857977+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59473920 unmapped: 237568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:30.858188+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59473920 unmapped: 237568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:31.858358+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59473920 unmapped: 237568 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:32.858584+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59482112 unmapped: 229376 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:33.858754+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59482112 unmapped: 229376 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:34.858935+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59490304 unmapped: 221184 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:35.859149+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59490304 unmapped: 221184 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:36.859413+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59490304 unmapped: 221184 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:37.859568+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59498496 unmapped: 212992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:38.859702+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59498496 unmapped: 212992 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:39.859899+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59506688 unmapped: 204800 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:40.860046+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59506688 unmapped: 204800 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:41.860193+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59506688 unmapped: 204800 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:42.860519+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59514880 unmapped: 196608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:43.860758+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59514880 unmapped: 196608 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:44.860949+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59523072 unmapped: 188416 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:45.861107+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59523072 unmapped: 188416 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:46.861285+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59523072 unmapped: 188416 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:47.861428+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59531264 unmapped: 180224 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:48.861570+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59531264 unmapped: 180224 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:49.861719+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59531264 unmapped: 180224 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:50.861910+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59539456 unmapped: 172032 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:51.862136+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59539456 unmapped: 172032 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:52.862309+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59547648 unmapped: 163840 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:53.862451+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59547648 unmapped: 163840 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:54.862689+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59555840 unmapped: 155648 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:55.862878+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59555840 unmapped: 155648 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:56.863056+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59555840 unmapped: 155648 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:57.863245+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59564032 unmapped: 147456 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:58.863486+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59564032 unmapped: 147456 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:19:59.863702+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59572224 unmapped: 139264 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:00.863863+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59572224 unmapped: 139264 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:01.864008+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59572224 unmapped: 139264 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:02.864153+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59580416 unmapped: 131072 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:03.864407+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59580416 unmapped: 131072 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:04.864608+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59588608 unmapped: 122880 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:05.864769+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59588608 unmapped: 122880 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:06.864932+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59588608 unmapped: 122880 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:07.865102+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59596800 unmapped: 114688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:08.865408+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59596800 unmapped: 114688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:09.865605+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59588608 unmapped: 122880 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:10.865840+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59596800 unmapped: 114688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:11.866058+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59596800 unmapped: 114688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:12.866361+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59596800 unmapped: 114688 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:13.866525+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59604992 unmapped: 106496 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:14.866719+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59604992 unmapped: 106496 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:15.866892+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59613184 unmapped: 98304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:16.867070+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59613184 unmapped: 98304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:17.867361+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59613184 unmapped: 98304 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:18.867532+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59621376 unmapped: 90112 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:19.867698+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59621376 unmapped: 90112 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:20.867890+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59629568 unmapped: 81920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:21.868058+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59629568 unmapped: 81920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:22.868233+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59629568 unmapped: 81920 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:23.868541+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59637760 unmapped: 73728 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:24.868844+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59645952 unmapped: 65536 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:25.869136+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59645952 unmapped: 65536 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:26.869392+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:27.869683+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59654144 unmapped: 57344 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:28.869906+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:29.870090+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:30.870367+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59662336 unmapped: 49152 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:31.870645+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:32.870838+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:33.870991+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59670528 unmapped: 40960 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:34.871168+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59678720 unmapped: 32768 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:35.871326+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59678720 unmapped: 32768 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:36.871599+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59686912 unmapped: 24576 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:37.871847+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59686912 unmapped: 24576 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:38.872094+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 16384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:39.872366+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 16384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:40.872605+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59695104 unmapped: 16384 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:41.872813+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 8192 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:42.872984+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59703296 unmapped: 8192 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:43.873226+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 0 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:44.873556+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 0 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:45.873696+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59711488 unmapped: 0 heap: 59711488 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:46.873920+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:47.874102+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:48.874255+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59719680 unmapped: 1040384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:49.874376+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:50.874609+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59727872 unmapped: 1032192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:51.874764+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59736064 unmapped: 1024000 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:52.875142+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59736064 unmapped: 1024000 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:53.875371+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59736064 unmapped: 1024000 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:54.875898+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59744256 unmapped: 1015808 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:55.876055+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59744256 unmapped: 1015808 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:56.876210+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:57.876445+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:58.876646+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59752448 unmapped: 1007616 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:20:59.876886+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:00.877121+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59768832 unmapped: 991232 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:01.877399+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:02.877667+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:03.877889+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59777024 unmapped: 983040 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:04.878329+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:05.878670+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59785216 unmapped: 974848 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:06.879122+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59793408 unmapped: 966656 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:07.879508+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59793408 unmapped: 966656 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:08.879684+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59793408 unmapped: 966656 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:09.879819+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59801600 unmapped: 958464 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:10.879960+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59801600 unmapped: 958464 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:11.880121+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59809792 unmapped: 950272 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:12.880256+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59809792 unmapped: 950272 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:13.880388+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59809792 unmapped: 950272 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:14.880611+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59817984 unmapped: 942080 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:15.880835+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59817984 unmapped: 942080 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:16.881027+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59826176 unmapped: 933888 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:17.883666+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59826176 unmapped: 933888 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:18.886024+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59826176 unmapped: 933888 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:19.887976+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59834368 unmapped: 925696 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:20.888787+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59834368 unmapped: 925696 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:21.889761+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59842560 unmapped: 917504 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:22.889951+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59842560 unmapped: 917504 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:23.890140+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59842560 unmapped: 917504 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:24.890479+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59850752 unmapped: 909312 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:25.890619+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59850752 unmapped: 909312 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:26.890783+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59858944 unmapped: 901120 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:27.891071+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59858944 unmapped: 901120 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:28.891209+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59858944 unmapped: 901120 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:29.891451+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59867136 unmapped: 892928 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:30.891665+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59867136 unmapped: 892928 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:31.891814+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59875328 unmapped: 884736 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:32.891994+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59875328 unmapped: 884736 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:33.892211+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59875328 unmapped: 884736 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:34.892441+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59883520 unmapped: 876544 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:35.892617+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59883520 unmapped: 876544 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:36.892766+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59891712 unmapped: 868352 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:37.892929+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59891712 unmapped: 868352 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:38.893098+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59891712 unmapped: 868352 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:39.893255+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59899904 unmapped: 860160 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:40.893378+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59899904 unmapped: 860160 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:41.893701+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59899904 unmapped: 860160 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:42.893837+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59908096 unmapped: 851968 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:43.894086+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59908096 unmapped: 851968 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:44.894400+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59908096 unmapped: 851968 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:45.894632+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59916288 unmapped: 843776 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:46.894919+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59916288 unmapped: 843776 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:47.895030+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:48.895211+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59916288 unmapped: 843776 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:49.895407+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59924480 unmapped: 835584 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:50.895542+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59924480 unmapped: 835584 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:51.895743+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59932672 unmapped: 827392 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:52.895892+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59932672 unmapped: 827392 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:53.896059+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59932672 unmapped: 827392 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:54.896268+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59940864 unmapped: 819200 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:55.896450+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59940864 unmapped: 819200 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:56.896565+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59940864 unmapped: 819200 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:57.896707+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59949056 unmapped: 811008 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:58.896860+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59949056 unmapped: 811008 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:21:59.897016+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59957248 unmapped: 802816 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:00.897205+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59957248 unmapped: 802816 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:01.897419+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59957248 unmapped: 802816 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:02.897549+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59965440 unmapped: 794624 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:03.897710+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59965440 unmapped: 794624 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:04.897995+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59973632 unmapped: 786432 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:05.898172+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59973632 unmapped: 786432 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:06.898325+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59973632 unmapped: 786432 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:07.898456+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59981824 unmapped: 778240 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:08.898577+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59981824 unmapped: 778240 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:09.898713+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59990016 unmapped: 770048 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:10.898833+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59990016 unmapped: 770048 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:11.899016+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59998208 unmapped: 761856 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:12.899215+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59998208 unmapped: 761856 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:13.899471+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 59998208 unmapped: 761856 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:14.899730+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60006400 unmapped: 753664 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:15.899886+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60006400 unmapped: 753664 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:16.900058+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60006400 unmapped: 753664 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:17.900212+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60014592 unmapped: 745472 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:18.900457+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60014592 unmapped: 745472 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:19.900675+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60014592 unmapped: 745472 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:20.900804+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60022784 unmapped: 737280 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:21.900951+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60022784 unmapped: 737280 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:22.901133+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60030976 unmapped: 729088 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:23.901247+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60030976 unmapped: 729088 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:24.901339+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60030976 unmapped: 729088 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:25.901485+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60030976 unmapped: 729088 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:26.901616+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60030976 unmapped: 729088 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:27.901793+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60039168 unmapped: 720896 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:28.901922+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60039168 unmapped: 720896 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:29.902397+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60039168 unmapped: 720896 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:30.902532+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60047360 unmapped: 712704 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:31.902661+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60047360 unmapped: 712704 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:32.902752+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60055552 unmapped: 704512 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:33.902876+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60055552 unmapped: 704512 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:34.903053+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60055552 unmapped: 704512 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:35.903190+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60063744 unmapped: 696320 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:36.903317+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60063744 unmapped: 696320 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:37.903454+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60063744 unmapped: 696320 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:38.903598+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60071936 unmapped: 688128 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:39.903756+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60071936 unmapped: 688128 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:40.903925+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60080128 unmapped: 679936 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:41.904101+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60080128 unmapped: 679936 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:42.904220+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60088320 unmapped: 671744 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:43.904404+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60088320 unmapped: 671744 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:44.904592+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60088320 unmapped: 671744 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:45.904810+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60096512 unmapped: 663552 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:46.904921+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60096512 unmapped: 663552 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:47.904991+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60104704 unmapped: 655360 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:48.905140+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60104704 unmapped: 655360 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:49.905273+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60104704 unmapped: 655360 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:50.905498+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60112896 unmapped: 647168 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:51.905669+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60112896 unmapped: 647168 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:52.905832+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60112896 unmapped: 647168 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:53.906052+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60121088 unmapped: 638976 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:54.906216+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60121088 unmapped: 638976 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:55.906387+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60121088 unmapped: 638976 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:56.906572+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60129280 unmapped: 630784 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:57.906673+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60129280 unmapped: 630784 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:58.906830+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60137472 unmapped: 622592 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:22:59.907068+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60137472 unmapped: 622592 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:00.907248+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60145664 unmapped: 614400 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:01.907408+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60145664 unmapped: 614400 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:02.907623+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60145664 unmapped: 614400 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:03.907801+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60153856 unmapped: 606208 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:04.908018+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60153856 unmapped: 606208 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:05.908158+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60162048 unmapped: 598016 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:06.908334+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60162048 unmapped: 598016 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:07.908500+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60162048 unmapped: 598016 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:08.908640+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60170240 unmapped: 589824 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:09.908775+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60170240 unmapped: 589824 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:10.908981+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60170240 unmapped: 589824 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:11.909101+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60178432 unmapped: 581632 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:12.909248+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60178432 unmapped: 581632 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:13.909354+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60186624 unmapped: 573440 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:14.909548+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60186624 unmapped: 573440 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:15.909778+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60194816 unmapped: 565248 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:16.910500+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60203008 unmapped: 557056 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:17.910680+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60203008 unmapped: 557056 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:18.910871+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60203008 unmapped: 557056 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:19.911008+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60211200 unmapped: 548864 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:20.911154+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60211200 unmapped: 548864 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:21.911312+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60219392 unmapped: 540672 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:22.911467+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60219392 unmapped: 540672 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:23.911694+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60219392 unmapped: 540672 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:24.911888+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60227584 unmapped: 532480 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:25.912047+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60227584 unmapped: 532480 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:26.912200+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60227584 unmapped: 532480 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:27.912362+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60235776 unmapped: 524288 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:28.912490+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60235776 unmapped: 524288 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:29.912691+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60243968 unmapped: 516096 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:30.912847+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60243968 unmapped: 516096 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:31.912985+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60252160 unmapped: 507904 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:32.913132+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60252160 unmapped: 507904 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:33.913485+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60252160 unmapped: 507904 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:34.913751+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60260352 unmapped: 499712 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:35.913863+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60260352 unmapped: 499712 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:36.913980+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60268544 unmapped: 491520 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:37.914102+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60268544 unmapped: 491520 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:38.914266+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60268544 unmapped: 491520 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:39.914436+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60276736 unmapped: 483328 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Cumulative writes: 4208 writes, 19K keys, 4208 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
                                           Cumulative WAL: 4208 writes, 369 syncs, 11.40 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 4208 writes, 19K keys, 4208 commit groups, 1.0 writes per commit group, ingest: 15.93 MB, 0.03 MB/s
                                           Interval WAL: 4208 writes, 369 syncs, 11.40 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b3090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b3090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b3090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:40.914568+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60334080 unmapped: 425984 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:41.914665+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60334080 unmapped: 425984 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:42.914834+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60342272 unmapped: 417792 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:43.914989+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60342272 unmapped: 417792 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:44.915159+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60342272 unmapped: 417792 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:45.915300+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60350464 unmapped: 409600 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:46.915413+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60350464 unmapped: 409600 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:47.915538+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60350464 unmapped: 409600 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:48.915684+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60358656 unmapped: 401408 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:49.915877+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60358656 unmapped: 401408 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:50.916132+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60366848 unmapped: 393216 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:51.916312+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60366848 unmapped: 393216 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:52.916536+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60375040 unmapped: 385024 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:53.916769+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60391424 unmapped: 368640 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:54.916944+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60391424 unmapped: 368640 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:55.917088+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60399616 unmapped: 360448 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:56.917251+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60399616 unmapped: 360448 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:57.917369+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60407808 unmapped: 352256 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:58.917519+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60407808 unmapped: 352256 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:23:59.917681+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60407808 unmapped: 352256 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:00.917841+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60416000 unmapped: 344064 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:01.918019+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60416000 unmapped: 344064 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:02.918129+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60416000 unmapped: 344064 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:03.918276+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60424192 unmapped: 335872 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:04.918521+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60424192 unmapped: 335872 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:05.918728+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60432384 unmapped: 327680 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:06.918925+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60432384 unmapped: 327680 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:07.919124+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60440576 unmapped: 319488 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:08.919328+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60440576 unmapped: 319488 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:09.919424+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60440576 unmapped: 319488 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:10.919540+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60448768 unmapped: 311296 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:11.919744+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60448768 unmapped: 311296 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:12.919890+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60456960 unmapped: 303104 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:13.920429+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60456960 unmapped: 303104 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:14.920694+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60456960 unmapped: 303104 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:15.920810+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60465152 unmapped: 294912 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:16.921012+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60465152 unmapped: 294912 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:17.921155+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60465152 unmapped: 294912 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:18.921323+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60473344 unmapped: 286720 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:19.921461+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60473344 unmapped: 286720 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:20.921612+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60481536 unmapped: 278528 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:21.921770+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60481536 unmapped: 278528 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:22.921927+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60481536 unmapped: 278528 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:23.922096+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60489728 unmapped: 270336 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:24.922625+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60489728 unmapped: 270336 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:25.922817+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60489728 unmapped: 270336 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:26.922937+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60497920 unmapped: 262144 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:27.923130+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60497920 unmapped: 262144 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:28.923312+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60497920 unmapped: 262144 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:29.923453+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60506112 unmapped: 253952 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:30.923612+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60506112 unmapped: 253952 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:31.923835+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60514304 unmapped: 245760 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:32.923980+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60514304 unmapped: 245760 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:33.924150+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60522496 unmapped: 237568 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:34.924342+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60522496 unmapped: 237568 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:35.924478+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60522496 unmapped: 237568 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:36.924672+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60530688 unmapped: 229376 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:37.924823+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60530688 unmapped: 229376 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:38.924954+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60538880 unmapped: 221184 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:39.925086+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60538880 unmapped: 221184 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:40.925232+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60538880 unmapped: 221184 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:41.925372+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60538880 unmapped: 221184 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:42.925532+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60547072 unmapped: 212992 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:43.925683+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60547072 unmapped: 212992 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:44.925866+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60555264 unmapped: 204800 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:45.926022+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60555264 unmapped: 204800 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:46.926180+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60555264 unmapped: 204800 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:47.926367+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60563456 unmapped: 196608 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:48.926497+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60563456 unmapped: 196608 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:49.926638+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60563456 unmapped: 196608 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:50.926797+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60571648 unmapped: 188416 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:51.926921+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60571648 unmapped: 188416 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:52.927070+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60579840 unmapped: 180224 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:53.927275+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60579840 unmapped: 180224 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:54.927476+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60579840 unmapped: 180224 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:55.927685+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60588032 unmapped: 172032 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:56.927921+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60588032 unmapped: 172032 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:57.928076+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60596224 unmapped: 163840 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:58.928248+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60596224 unmapped: 163840 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:24:59.928434+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60588032 unmapped: 172032 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:00.928592+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60596224 unmapped: 163840 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:01.928796+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60596224 unmapped: 163840 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:02.929012+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60604416 unmapped: 155648 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:03.929182+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60604416 unmapped: 155648 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:04.929388+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60604416 unmapped: 155648 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:05.929529+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60612608 unmapped: 147456 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:06.929679+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60612608 unmapped: 147456 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:07.929834+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60620800 unmapped: 139264 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:08.929917+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60620800 unmapped: 139264 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:09.930077+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60620800 unmapped: 139264 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:10.930198+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60628992 unmapped: 131072 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:11.930363+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60628992 unmapped: 131072 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:12.930519+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60637184 unmapped: 122880 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:13.930666+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60637184 unmapped: 122880 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:14.930890+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60637184 unmapped: 122880 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:15.931027+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60645376 unmapped: 114688 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:16.931214+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60645376 unmapped: 114688 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:17.931374+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60645376 unmapped: 114688 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:18.931526+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60653568 unmapped: 106496 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:19.931669+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60653568 unmapped: 106496 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:20.931790+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60661760 unmapped: 98304 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:21.931976+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60661760 unmapped: 98304 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:22.932122+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60661760 unmapped: 98304 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:23.932250+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:24.932432+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:25.932578+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:26.932727+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:27.932851+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:28.932962+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:29.933065+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:30.933173+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:31.933326+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60694528 unmapped: 65536 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:32.933465+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60694528 unmapped: 65536 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:33.933605+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60702720 unmapped: 57344 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:34.933812+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60702720 unmapped: 57344 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:35.933945+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60702720 unmapped: 57344 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:36.934119+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60710912 unmapped: 49152 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:37.934256+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60710912 unmapped: 49152 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:38.934409+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:39.934567+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:40.934752+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:41.934924+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:42.935105+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:43.935270+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:44.935577+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:45.935939+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:46.936237+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:47.936399+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:48.936608+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:49.936817+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:50.936999+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:51.937170+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:52.937421+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:53.937575+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:54.937767+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:55.937977+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:56.938150+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:57.938373+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:58.938545+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:25:59.938718+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:00.938879+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:01.939005+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:02.939186+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:03.939356+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:04.939514+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:05.939702+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:06.939930+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:07.940098+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:08.940220+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:09.940390+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:10.940535+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:11.940686+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:12.940837+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:13.941014+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:14.941190+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:15.941334+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:16.941456+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:17.941589+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:18.941711+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:19.941838+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:20.941975+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:21.942106+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:22.942246+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:23.942353+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:24.942532+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:25.942677+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:26.942838+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:27.943009+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:28.943154+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:29.943328+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:30.943446+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:31.943567+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:32.943683+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:33.943815+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:34.943977+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:35.944102+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:36.944226+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:37.944372+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:38.944487+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:39.944625+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:40.944777+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:41.944987+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:42.945145+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:43.945277+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:44.945424+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:45.945609+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:46.945758+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:47.945913+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:48.946035+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:49.946154+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:50.946310+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:51.946439+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:52.946566+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:53.946709+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:54.946868+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:55.947001+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:56.947114+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:57.947310+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:58.947441+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:26:59.947583+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:00.947764+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:01.954104+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:02.954261+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:03.954343+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:04.954520+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:05.954702+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:06.954890+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:07.955090+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:08.955265+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:09.955451+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:10.955565+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:11.955781+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:12.955909+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:13.956035+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:14.956186+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:15.956330+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:16.956442+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:17.956632+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:18.956824+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:19.956969+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:20.957145+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:21.957376+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:22.957514+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:23.957638+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:24.957803+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:25.957932+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:26.958062+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:27.958201+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:28.958366+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:29.958559+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:30.958796+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:31.958969+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:32.959167+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:33.959367+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:34.959563+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:35.959705+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:36.959862+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:37.960104+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:38.960284+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:39.960441+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:40.960548+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:41.960714+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:42.960858+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:43.961028+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:44.961232+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:45.961376+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:46.961513+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:47.961694+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:48.961923+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:49.962070+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:50.962188+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:51.963098+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:52.963338+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:53.963830+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:54.964004+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:55.964144+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:56.964318+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:57.964540+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:58.964692+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:27:59.964876+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:00.965027+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:01.965208+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:02.965380+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:03.965507+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:04.965768+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:05.966024+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:06.966209+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:07.966434+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:08.966559+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:09.966729+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:10.966869+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:11.967037+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:12.967224+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:13.967429+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:14.967650+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:15.967889+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:16.968062+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:17.968189+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:18.968355+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:19.968574+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:20.968741+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:21.968914+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:22.969113+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:23.969257+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:24.969488+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:25.969644+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:26.969806+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:27.969927+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:28.970154+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:29.970410+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:30.970559+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:31.970744+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:32.971236+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:33.971369+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:34.971554+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:35.971741+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:36.971921+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:37.972117+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:38.972320+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60743680 unmapped: 16384 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:39.972518+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60751872 unmapped: 8192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:40.972641+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60751872 unmapped: 8192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-mon[75031]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 01 09:48:10 compute-0 ceph-mon[75031]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 01 09:48:10 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/2133580005' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Dec 01 09:48:10 compute-0 ceph-mon[75031]: pgmap v1129: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:41.972821+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60751872 unmapped: 8192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:42.972970+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60751872 unmapped: 8192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:43.973097+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60751872 unmapped: 8192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:44.973263+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60751872 unmapped: 8192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:45.973418+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60751872 unmapped: 8192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:46.973738+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60751872 unmapped: 8192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:47.973896+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60751872 unmapped: 8192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:48.974025+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60751872 unmapped: 8192 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:49.974164+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: mgrc ms_handle_reset ms_handle_reset con 0x55c737203c00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/3312476512
Dec 01 09:48:10 compute-0 ceph-osd[88047]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/3312476512,v1:192.168.122.100:6801/3312476512]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: get_auth_request con 0x55c739d2a000 auth_method 0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: mgrc handle_mgr_configure stats_period=5
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:50.974620+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:51.974806+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:52.974952+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:53.975184+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:54.975394+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:55.975668+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:56.975817+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:57.976015+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 ms_handle_reset con 0x55c737443400 session 0x55c7371974a0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900b400
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:58.976197+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:28:59.976334+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:00.976453+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:01.976607+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:02.976813+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:03.977004+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:04.977179+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:05.977370+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:06.977505+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:07.977661+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:08.977803+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:09.977959+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:10.978086+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:11.978260+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:12.978417+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:13.978575+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:14.978779+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60760064 unmapped: 0 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:15.978992+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:16.979129+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:17.979323+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:18.979496+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:19.979713+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:20.979886+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:21.980214+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:22.980476+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:23.980640+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:24.980840+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:25.980995+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:26.981154+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:27.981334+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:28.981486+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:29.981744+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:30.981918+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:31.982052+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:32.982675+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:33.982799+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:34.983006+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:35.983138+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:36.983271+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:37.983410+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:38.983552+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:39.983705+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:40.983844+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:41.984018+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:42.984160+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:43.984325+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:44.984480+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:45.984617+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:46.984749+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:47.984857+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:48.984975+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:49.985083+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:50.985201+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:51.985392+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:52.985536+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:53.985858+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:54.986082+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:55.986983+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:56.987137+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:57.987318+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:58.987490+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:29:59.987694+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:00.987861+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 ms_handle_reset con 0x55c73900a000 session 0x55c7384cc960
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900ac00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:01.988051+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:02.988194+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:03.988385+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:04.988588+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:05.988738+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:06.989032+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:07.989172+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:08.989386+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:09.989508+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:10.989650+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:11.989779+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:12.989939+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:13.990212+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:14.990377+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:15.990551+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:16.990684+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:17.990862+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:18.991063+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:19.991270+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:20.991508+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:21.991644+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:22.991787+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:23.991938+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:24.992131+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:25.992261+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:26.992536+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:27.992830+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:28.992983+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:29.993121+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:30.993258+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:31.993353+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:32.993497+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:33.993637+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:34.993849+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:35.994007+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:36.994127+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:37.994264+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:38.994350+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:39.994514+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:40.994731+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:41.994976+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:42.995131+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:43.995259+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:44.995429+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:45.995557+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:46.995692+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:47.995839+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:48.996015+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:49.996207+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:50.996405+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:51.996548+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:52.996670+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:53.996807+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60669952 unmapped: 90112 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:54.996987+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:55.997165+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:56.997359+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:57.998869+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:58.999121+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:30:59.999373+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:00.999489+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:01.999609+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:02.999825+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:03.999965+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:05.000708+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:06.000842+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:07.001067+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:08.001577+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:09.001888+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:10.002027+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:11.002223+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:12.002417+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:13.002812+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:14.003037+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:15.003215+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:16.003688+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:17.003868+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:18.004023+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:19.004179+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:20.004372+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:21.004494+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:22.004633+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:23.004757+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:24.004883+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:25.005073+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:26.005233+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:27.005452+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:28.005619+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:29.005805+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:30.005974+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:31.006077+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:32.012580+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:33.012713+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:34.012823+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:35.013018+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:36.013147+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:37.013303+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:38.013457+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:39.013615+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:40.013777+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:41.013932+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:42.014073+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:43.014269+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:44.014417+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:45.014582+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:46.014758+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:47.014873+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:48.015003+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:49.015166+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:50.015346+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:51.015487+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:52.015639+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:53.015771+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:54.015890+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:55.016080+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:56.016254+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:57.016441+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:58.016618+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:31:59.016746+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:00.016920+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:01.017082+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:02.017385+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:03.017909+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:04.018111+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:05.018332+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:06.018553+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:07.018758+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:08.018917+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:09.019116+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:10.019312+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:11.019476+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:12.019632+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:13.019809+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:14.019941+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:15.020097+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:16.020431+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:17.020638+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:18.020890+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:19.021176+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:20.021357+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:21.021573+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:22.021861+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:23.022136+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:24.022377+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60678144 unmapped: 81920 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:25.022752+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:26.023068+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:27.023388+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:28.023965+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:29.024277+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:30.024590+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:31.024808+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:32.025022+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:33.025279+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:34.025683+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:35.026118+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:36.026363+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:37.026569+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:38.026760+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:39.027041+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:40.027330+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:41.027515+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:42.027844+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:43.028057+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:44.028161+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:45.028363+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:46.028499+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:47.028614+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:48.028794+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:49.028975+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:50.029116+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:51.029325+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:52.029437+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:53.029575+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:54.029691+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:55.029858+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:56.029967+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:57.030128+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:58.030282+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:32:59.030429+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:00.030580+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:01.030769+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:02.030916+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:03.031063+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:04.031225+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:05.031569+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:06.031783+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:07.032002+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:08.032419+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:09.032649+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:10.032999+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:11.033222+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:12.033399+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:13.033574+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:14.033763+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:15.033986+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:16.034208+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:17.034370+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:18.034499+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:19.034604+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:20.034889+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:21.035023+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:22.035156+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:23.035368+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:24.035980+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:25.036178+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60694528 unmapped: 65536 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:26.036370+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60694528 unmapped: 65536 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:27.036571+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60694528 unmapped: 65536 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:28.036739+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60694528 unmapped: 65536 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:29.036898+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60694528 unmapped: 65536 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:30.037102+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:31.037364+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:32.037526+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:33.037664+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:34.037839+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:35.038047+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:36.038245+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:37.038402+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:38.038566+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:39.038681+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60686336 unmapped: 73728 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:40.038897+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Cumulative writes: 4208 writes, 19K keys, 4208 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 4208 writes, 369 syncs, 11.40 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b3090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b3090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b3090#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c7363b31f0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:41.039078+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:42.039242+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:43.039423+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:44.039607+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:45.039878+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:46.040192+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:47.040383+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:48.040579+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:49.040812+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:50.041018+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:51.041237+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:52.041499+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:53.041735+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:54.041897+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:55.042083+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:56.042232+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:57.042449+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:58.042632+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:33:59.042791+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:00.042965+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:01.043095+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:02.043359+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:03.043508+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:04.043736+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:05.043906+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:06.044061+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:07.044187+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:08.044367+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:09.044567+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:10.044855+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:11.045529+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:12.045680+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:13.045840+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:14.045995+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:15.046537+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:16.046700+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:17.046894+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:18.047079+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:19.047375+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:20.047573+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:21.047765+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:22.047964+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:23.048138+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:24.048332+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:25.048540+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:26.048684+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:27.048828+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:28.048980+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:29.049109+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:30.049217+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:31.049353+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60719104 unmapped: 40960 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:32.049492+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:33.049649+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:34.049833+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:35.050033+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:36.050203+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:37.050405+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:38.050612+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:39.050775+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:40.050945+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:41.051095+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:42.051243+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:43.051425+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:44.051549+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:45.051715+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:46.051888+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:47.052036+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:48.052171+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:49.052488+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:50.052663+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:51.052835+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:52.053045+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:53.053180+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:54.053358+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:55.053569+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:56.053726+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:57.053884+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:58.054054+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:34:59.054269+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:00.054474+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:01.054639+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:02.054803+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:03.054991+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:04.055137+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:05.055341+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:06.055492+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:07.055720+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:08.055914+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:09.056146+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:10.056418+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:11.056595+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:12.056730+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:13.056851+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:14.057019+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:15.057203+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:16.057398+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:17.057584+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:18.057872+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:19.058143+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 24576 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:20.058406+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:21.058548+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:22.058680+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:23.058874+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:24.059090+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:25.059389+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:26.059565+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:27.059816+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:28.060001+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:29.060146+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:30.060357+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:31.060507+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:32.060685+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:33.060874+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:34.061020+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:35.061209+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:36.061351+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:37.061623+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:38.061767+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:39.061908+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:40.062073+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:41.062233+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:42.062377+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:43.062514+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:44.062654+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:45.062866+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 heartbeat osd_stat(store_statfs(0x4fe159000/0x0/0x4ffc00000, data 0x2da0e/0x75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:46.063077+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411831 data_alloc: 218103808 data_used: 8192
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:47.063225+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:48.063402+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:49.063571+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a000
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 47 handle_osd_map epochs [48,48], i have 47, src has [1,48]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 1115.405761719s of 1115.446777344s, submitted: 10
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:50.063758+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60727296 unmapped: 32768 heap: 60760064 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 48 heartbeat osd_stat(store_statfs(0x4fe155000/0x0/0x4ffc00000, data 0x2efc8/0x78000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 48 handle_osd_map epochs [49,49], i have 48, src has [1,49]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 48 handle_osd_map epochs [49,49], i have 49, src has [1,49]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:51.063926+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 418975 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60735488 unmapped: 1073152 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 49 handle_osd_map epochs [50,50], i have 49, src has [1,50]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 50 ms_handle_reset con 0x55c73900a000 session 0x55c739057a40
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:52.064098+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a400
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60825600 unmapped: 983040 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:53.064303+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60825600 unmapped: 983040 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 50 handle_osd_map epochs [50,51], i have 50, src has [1,51]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 51 ms_handle_reset con 0x55c73900a400 session 0x55c738b12960
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 51 heartbeat osd_stat(store_statfs(0x4fe14c000/0x0/0x4ffc00000, data 0x31bdf/0x80000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:54.064474+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60932096 unmapped: 876544 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:55.065374+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60932096 unmapped: 876544 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:56.065591+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 430013 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60932096 unmapped: 876544 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:57.065775+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60932096 unmapped: 876544 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:58.065944+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60932096 unmapped: 876544 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 51 heartbeat osd_stat(store_statfs(0x4fe149000/0x0/0x4ffc00000, data 0x331d8/0x84000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:35:59.066139+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60932096 unmapped: 876544 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 51 heartbeat osd_stat(store_statfs(0x4fe149000/0x0/0x4ffc00000, data 0x331d8/0x84000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:00.066359+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 51 handle_osd_map epochs [51,52], i have 51, src has [1,52]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.248213768s of 10.329307556s, submitted: 15
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60948480 unmapped: 860160 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:01.066559+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 432809 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60948480 unmapped: 860160 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:02.066773+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60948480 unmapped: 860160 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:03.066969+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:04.067143+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 52 heartbeat osd_stat(store_statfs(0x4fe146000/0x0/0x4ffc00000, data 0x34678/0x87000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:05.067414+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:06.067603+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 432809 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:07.067986+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:08.068225+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:09.068419+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:10.068645+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 52 heartbeat osd_stat(store_statfs(0x4fe146000/0x0/0x4ffc00000, data 0x34678/0x87000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:11.068791+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 432809 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:12.068983+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 52 heartbeat osd_stat(store_statfs(0x4fe146000/0x0/0x4ffc00000, data 0x34678/0x87000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:13.069140+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:14.069313+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 52 heartbeat osd_stat(store_statfs(0x4fe146000/0x0/0x4ffc00000, data 0x34678/0x87000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:15.069466+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:16.069636+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 432809 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:17.069791+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:18.070013+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:19.070203+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:20.071149+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 52 heartbeat osd_stat(store_statfs(0x4fe146000/0x0/0x4ffc00000, data 0x34678/0x87000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:21.071906+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 432809 data_alloc: 218103808 data_used: 16384
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:22.072099+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:23.072264+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:24.072697+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 52 heartbeat osd_stat(store_statfs(0x4fe146000/0x0/0x4ffc00000, data 0x34678/0x87000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:25.073054+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 60981248 unmapped: 827392 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c737c7bc00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 25.547399521s of 25.560478210s, submitted: 9
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:26.073374+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 52 handle_osd_map epochs [52,53], i have 52, src has [1,53]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 439721 data_alloc: 218103808 data_used: 24576
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 53 ms_handle_reset con 0x55c737c7bc00 session 0x55c738b12000
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 61038592 unmapped: 770048 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:27.073590+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739d63800
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 53 ms_handle_reset con 0x55c739d63800 session 0x55c738b13a40
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739d63c00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 53 ms_handle_reset con 0x55c739d63c00 session 0x55c738d850e0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 61087744 unmapped: 720896 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:28.073799+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 53 heartbeat osd_stat(store_statfs(0x4fe141000/0x0/0x4ffc00000, data 0x36055/0x8c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 61087744 unmapped: 720896 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c737c7bc00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:29.073996+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 53 handle_osd_map epochs [54,54], i have 53, src has [1,54]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 54 ms_handle_reset con 0x55c737c7bc00 session 0x55c738d84000
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 61333504 unmapped: 475136 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a000
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 54 ms_handle_reset con 0x55c73900a000 session 0x55c738adbe00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a400
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 54 ms_handle_reset con 0x55c73900a400 session 0x55c738ada780
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:30.074160+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739d63800
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 61358080 unmapped: 450560 heap: 61808640 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:31.074361+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 54 heartbeat osd_stat(store_statfs(0x4fe13b000/0x0/0x4ffc00000, data 0x37a55/0x92000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 54 handle_osd_map epochs [55,55], i have 54, src has [1,55]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 450304 data_alloc: 218103808 data_used: 40960
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 55 ms_handle_reset con 0x55c739d63800 session 0x55c737196f00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 62545920 unmapped: 311296 heap: 62857216 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:32.074609+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739db2800
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 55 heartbeat osd_stat(store_statfs(0x4fe138000/0x0/0x4ffc00000, data 0x39040/0x94000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 62554112 unmapped: 303104 heap: 62857216 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:33.075217+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 55 handle_osd_map epochs [56,56], i have 55, src has [1,56]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 56 ms_handle_reset con 0x55c739db2800 session 0x55c738612960
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 62513152 unmapped: 1392640 heap: 63905792 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:34.075583+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 56 heartbeat osd_stat(store_statfs(0x4fe13a000/0x0/0x4ffc00000, data 0x39e08/0x93000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 62513152 unmapped: 1392640 heap: 63905792 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c737c7bc00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:35.075760+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 56 handle_osd_map epochs [57,57], i have 56, src has [1,57]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 57 ms_handle_reset con 0x55c737c7bc00 session 0x55c738648b40
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 62545920 unmapped: 1359872 heap: 63905792 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:36.075912+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 458059 data_alloc: 218103808 data_used: 40960
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 62562304 unmapped: 1343488 heap: 63905792 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:37.076072+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a400
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.251831055s of 11.507549286s, submitted: 61
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 57 ms_handle_reset con 0x55c73900a400 session 0x55c7386481e0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 62603264 unmapped: 1302528 heap: 63905792 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a000
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 57 handle_osd_map epochs [58,58], i have 57, src has [1,58]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:38.076201+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739d63800
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 58 handle_osd_map epochs [59,59], i have 58, src has [1,59]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 59 ms_handle_reset con 0x55c73900a000 session 0x55c738adbe00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 62996480 unmapped: 17694720 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739db3c00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:39.076542+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 59 heartbeat osd_stat(store_statfs(0x4fc92c000/0x0/0x4ffc00000, data 0x183e90e/0x18a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 59 handle_osd_map epochs [59,60], i have 59, src has [1,60]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 59 handle_osd_map epochs [60,60], i have 60, src has [1,60]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 60 ms_handle_reset con 0x55c739db3c00 session 0x55c738ada960
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 60 ms_handle_reset con 0x55c739d63800 session 0x55c7371974a0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 63070208 unmapped: 17620992 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 60 heartbeat osd_stat(store_statfs(0x4fc928000/0x0/0x4ffc00000, data 0x183ff1a/0x18a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:40.076694+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c737c7bc00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a000
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 60 handle_osd_map epochs [61,61], i have 60, src has [1,61]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 61 ms_handle_reset con 0x55c737c7bc00 session 0x55c739090000
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a400
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 63135744 unmapped: 17555456 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:41.076863+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 61 ms_handle_reset con 0x55c73900a400 session 0x55c7390910e0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 61 handle_osd_map epochs [62,62], i have 61, src has [1,62]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 649667 data_alloc: 218103808 data_used: 57344
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 62 ms_handle_reset con 0x55c73900a000 session 0x55c7389ad680
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 63184896 unmapped: 17506304 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739db3c00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739db3800
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:42.077056+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 62 handle_osd_map epochs [63,63], i have 62, src has [1,63]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 63 ms_handle_reset con 0x55c739db3c00 session 0x55c7390905a0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 63 ms_handle_reset con 0x55c739db3800 session 0x55c7389ac3c0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 63225856 unmapped: 17465344 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:43.077203+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 63 heartbeat osd_stat(store_statfs(0x4fe11e000/0x0/0x4ffc00000, data 0x4438b/0xaf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 63 handle_osd_map epochs [64,64], i have 63, src has [1,64]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 63 handle_osd_map epochs [64,64], i have 64, src has [1,64]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 63234048 unmapped: 17457152 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:44.077346+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 64 heartbeat osd_stat(store_statfs(0x4fe11a000/0x0/0x4ffc00000, data 0x45971/0xb2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 64 handle_osd_map epochs [65,65], i have 64, src has [1,65]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 64 handle_osd_map epochs [65,65], i have 65, src has [1,65]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 63275008 unmapped: 17416192 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c737c7bc00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:45.077510+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 65 handle_osd_map epochs [65,66], i have 65, src has [1,66]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 66 heartbeat osd_stat(store_statfs(0x4fe116000/0x0/0x4ffc00000, data 0x46fab/0xb5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 66 ms_handle_reset con 0x55c737c7bc00 session 0x55c739090d20
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 63315968 unmapped: 17375232 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a000
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:46.077668+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 499959 data_alloc: 218103808 data_used: 122880
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 66 handle_osd_map epochs [67,67], i have 66, src has [1,67]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 67 ms_handle_reset con 0x55c73900a000 session 0x55c7389c92c0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 63348736 unmapped: 17342464 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a400
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:47.077808+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739db3c00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 67 handle_osd_map epochs [68,68], i have 67, src has [1,68]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.564005852s of 10.165904045s, submitted: 115
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 63397888 unmapped: 17293312 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 68 ms_handle_reset con 0x55c73900a400 session 0x55c7389a6d20
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739db3400
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:48.078011+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 68 handle_osd_map epochs [69,69], i have 68, src has [1,69]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 69 ms_handle_reset con 0x55c739db3c00 session 0x55c738ada5a0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 69 ms_handle_reset con 0x55c739db3400 session 0x55c7389a74a0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 63528960 unmapped: 17162240 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c737c7bc00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a000
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:49.078258+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 69 handle_osd_map epochs [70,70], i have 69, src has [1,70]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 70 ms_handle_reset con 0x55c737c7bc00 session 0x55c738ada780
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 70 ms_handle_reset con 0x55c73900a000 session 0x55c7371974a0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 64716800 unmapped: 15974400 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a400
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:50.078495+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 70 handle_osd_map epochs [70,71], i have 70, src has [1,71]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 71 ms_handle_reset con 0x55c73900a400 session 0x55c7384d9c20
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 64782336 unmapped: 15908864 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 71 heartbeat osd_stat(store_statfs(0x4fe10a000/0x0/0x4ffc00000, data 0x4ea1e/0xc0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:51.078685+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 71 heartbeat osd_stat(store_statfs(0x4fe10a000/0x0/0x4ffc00000, data 0x4ea1e/0xc0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 513448 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 64782336 unmapped: 15908864 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:52.078854+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 64782336 unmapped: 15908864 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:53.079000+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 64782336 unmapped: 15908864 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:54.079143+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 71 heartbeat osd_stat(store_statfs(0x4fe10a000/0x0/0x4ffc00000, data 0x4ea1e/0xc0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 71 handle_osd_map epochs [71,72], i have 71, src has [1,72]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 64806912 unmapped: 15884288 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:55.079331+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739db3c00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 72 handle_osd_map epochs [73,73], i have 72, src has [1,73]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 73 ms_handle_reset con 0x55c739db3c00 session 0x55c7389a70e0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 66953216 unmapped: 13737984 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:56.079450+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 521811 data_alloc: 218103808 data_used: 131072
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 66953216 unmapped: 13737984 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:57.079631+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 66969600 unmapped: 13721600 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:58.079779+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 66969600 unmapped: 13721600 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 73 heartbeat osd_stat(store_statfs(0x4fe105000/0x0/0x4ffc00000, data 0x51719/0xc7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1a2f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:36:59.079934+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 73 handle_osd_map epochs [73,74], i have 73, src has [1,74]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.513625145s of 12.047250748s, submitted: 150
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 66977792 unmapped: 13713408 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739db3000
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:00.080067+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 74 handle_osd_map epochs [75,75], i have 74, src has [1,75]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 75 ms_handle_reset con 0x55c739db3000 session 0x55c7389a7680
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 66863104 unmapped: 13828096 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:01.080274+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c737c7bc00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 529152 data_alloc: 218103808 data_used: 143360
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 75 ms_handle_reset con 0x55c737c7bc00 session 0x55c7389a7e00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 66871296 unmapped: 13819904 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:02.080519+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a000
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 75 ms_handle_reset con 0x55c73900a000 session 0x55c738649e00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a400
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 75 ms_handle_reset con 0x55c73900a400 session 0x55c738649860
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 66871296 unmapped: 13819904 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:03.080677+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739db3c00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 75 handle_osd_map epochs [76,76], i have 75, src has [1,76]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 76 handle_osd_map epochs [77,77], i have 76, src has [1,77]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 66961408 unmapped: 13729792 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 77 ms_handle_reset con 0x55c739db3c00 session 0x55c738649680
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:04.080815+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 77 handle_osd_map epochs [77,78], i have 77, src has [1,78]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739db2c00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 78 ms_handle_reset con 0x55c739db2c00 session 0x55c737197860
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67018752 unmapped: 13672448 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 78 heartbeat osd_stat(store_statfs(0x4fdce9000/0x0/0x4ffc00000, data 0x56f89/0xd3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e3f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:05.081035+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 78 handle_osd_map epochs [79,79], i have 78, src has [1,79]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67067904 unmapped: 13623296 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:06.081177+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 546256 data_alloc: 218103808 data_used: 143360
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67067904 unmapped: 13623296 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:07.081339+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 79 heartbeat osd_stat(store_statfs(0x4fdce0000/0x0/0x4ffc00000, data 0x59a92/0xda000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e3f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67067904 unmapped: 13623296 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c737c7bc00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 79 ms_handle_reset con 0x55c737c7bc00 session 0x55c739090d20
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a000
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 79 ms_handle_reset con 0x55c73900a000 session 0x55c737a0c780
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a400
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 79 ms_handle_reset con 0x55c73900a400 session 0x55c7389ac5a0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:08.081584+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739db2c00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 79 ms_handle_reset con 0x55c739db2c00 session 0x55c7384cda40
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739db3c00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739e17400
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 79 ms_handle_reset con 0x55c739e17400 session 0x55c7386130e0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 79 ms_handle_reset con 0x55c739db3c00 session 0x55c7384cd680
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c737c7bc00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 79 ms_handle_reset con 0x55c737c7bc00 session 0x55c7384cc000
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739000400
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 79 ms_handle_reset con 0x55c739000400 session 0x55c7390901e0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a400
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 79 ms_handle_reset con 0x55c73900a400 session 0x55c7390910e0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a000
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67092480 unmapped: 13598720 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 79 ms_handle_reset con 0x55c73900a000 session 0x55c738a9bc20
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:09.081764+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c737c7bc00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 79 ms_handle_reset con 0x55c737c7bc00 session 0x55c738a9b4a0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67092480 unmapped: 13598720 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739000400
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 79 ms_handle_reset con 0x55c739000400 session 0x55c738a9be00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:10.081978+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a400
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 79 handle_osd_map epochs [80,80], i have 79, src has [1,80]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.614140511s of 10.876181602s, submitted: 69
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 80 ms_handle_reset con 0x55c73900a400 session 0x55c7384d9c20
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739db3c00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 80 ms_handle_reset con 0x55c739db3c00 session 0x55c7384d8960
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67190784 unmapped: 13500416 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:11.082101+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739e16800
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739e16400
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 552791 data_alloc: 218103808 data_used: 159744
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 80 heartbeat osd_stat(store_statfs(0x4fdcde000/0x0/0x4ffc00000, data 0x5af8a/0xdf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e3f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67264512 unmapped: 13426688 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:12.082219+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67264512 unmapped: 13426688 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:13.082363+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67264512 unmapped: 13426688 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:14.082533+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 80 heartbeat osd_stat(store_statfs(0x4fdcde000/0x0/0x4ffc00000, data 0x5af8a/0xdf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e3f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67264512 unmapped: 13426688 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:15.082742+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739e16000
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 80 ms_handle_reset con 0x55c739e16000 session 0x55c7384d92c0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739e16000
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67354624 unmapped: 13336576 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:16.082915+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 80 handle_osd_map epochs [80,81], i have 80, src has [1,81]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 81 ms_handle_reset con 0x55c739e16000 session 0x55c7384d9c20
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c737c7bc00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739000400
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 81 ms_handle_reset con 0x55c739000400 session 0x55c738d84f00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 81 ms_handle_reset con 0x55c737c7bc00 session 0x55c7384d92c0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c73900a400
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 81 ms_handle_reset con 0x55c73900a400 session 0x55c738a9bc20
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 559232 data_alloc: 218103808 data_used: 159744
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739db3c00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67526656 unmapped: 13164544 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:17.083231+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 81 handle_osd_map epochs [81,82], i have 81, src has [1,82]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 82 ms_handle_reset con 0x55c739db3c00 session 0x55c738a9be00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739db3c00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 82 heartbeat osd_stat(store_statfs(0x4fdcd9000/0x0/0x4ffc00000, data 0x5c586/0xe3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e3f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 13189120 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:18.083424+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 82 handle_osd_map epochs [82,83], i have 82, src has [1,83]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 83 ms_handle_reset con 0x55c739db3c00 session 0x55c7390910e0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67526656 unmapped: 13164544 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:19.083597+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67526656 unmapped: 13164544 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:20.083794+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c737c7bc00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 83 heartbeat osd_stat(store_statfs(0x4fdcd4000/0x0/0x4ffc00000, data 0x5f16a/0xe9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e3f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67526656 unmapped: 13164544 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.453618050s of 10.603578568s, submitted: 54
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 83 ms_handle_reset con 0x55c737c7bc00 session 0x55c7384cc000
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:21.083940+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739000400
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 562680 data_alloc: 218103808 data_used: 163840
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67567616 unmapped: 13123584 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:22.084131+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 83 handle_osd_map epochs [83,84], i have 83, src has [1,84]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 84 ms_handle_reset con 0x55c739000400 session 0x55c738afd2c0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67674112 unmapped: 13017088 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:23.084319+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 84 heartbeat osd_stat(store_statfs(0x4fdcd1000/0x0/0x4ffc00000, data 0x60752/0xeb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1e3f9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 84 ms_handle_reset con 0x55c739e16800 session 0x55c7389a4960
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 84 ms_handle_reset con 0x55c739e16400 session 0x55c73798a960
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c737c7bc00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67674112 unmapped: 13017088 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:24.084530+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67674112 unmapped: 13017088 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:25.084728+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 84 handle_osd_map epochs [85,85], i have 84, src has [1,85]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67690496 unmapped: 13000704 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:26.084866+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 85 handle_osd_map epochs [85,86], i have 85, src has [1,86]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 86 ms_handle_reset con 0x55c737c7bc00 session 0x55c738d85e00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 569627 data_alloc: 218103808 data_used: 172032
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 12877824 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:27.085046+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 12877824 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:28.085226+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 86 heartbeat osd_stat(store_statfs(0x4fcb2e000/0x0/0x4ffc00000, data 0x631d9/0xee000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 12877824 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:29.085394+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739000400
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 86 ms_handle_reset con 0x55c739000400 session 0x55c738b12b40
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 12828672 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:30.085535+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 12828672 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:31.085705+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739db3c00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 86 handle_osd_map epochs [87,87], i have 86, src has [1,87]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.347141266s of 10.605584145s, submitted: 81
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 87 ms_handle_reset con 0x55c739db3c00 session 0x55c738af8b40
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739e16800
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 571375 data_alloc: 218103808 data_used: 176128
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67903488 unmapped: 12787712 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:32.085843+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _renew_subs
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 87 handle_osd_map epochs [88,88], i have 87, src has [1,88]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 88 ms_handle_reset con 0x55c739e16800 session 0x55c738af9e00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:33.086027+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 88 heartbeat osd_stat(store_statfs(0x4fcb29000/0x0/0x4ffc00000, data 0x65c90/0xf3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67944448 unmapped: 12746752 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:34.086208+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67944448 unmapped: 12746752 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:35.086352+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 88 handle_osd_map epochs [88,89], i have 88, src has [1,89]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 12730368 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:36.086474+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 578765 data_alloc: 218103808 data_used: 176128
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 12730368 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:37.086654+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 12730368 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:38.086856+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 12730368 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:39.087014+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 89 heartbeat osd_stat(store_statfs(0x4fcb27000/0x0/0x4ffc00000, data 0x6714c/0xf6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 12730368 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:40.087181+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 12730368 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:41.087347+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 89 heartbeat osd_stat(store_statfs(0x4fcb27000/0x0/0x4ffc00000, data 0x6714c/0xf6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 578765 data_alloc: 218103808 data_used: 176128
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 12730368 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:42.087503+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 89 handle_osd_map epochs [90,90], i have 89, src has [1,90]
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.186514854s of 11.261335373s, submitted: 63
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67944448 unmapped: 12746752 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:43.087636+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67952640 unmapped: 12738560 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:44.087783+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67952640 unmapped: 12738560 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:45.087962+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 12730368 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:46.088148+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 12730368 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:47.088312+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 12730368 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:48.088440+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 12730368 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:49.088581+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 12730368 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:50.088733+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 12730368 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:51.088870+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 12730368 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:52.088993+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:53.089115+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:54.089340+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:55.089530+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:56.089667+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:57.089840+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:58.090000+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:37:59.090186+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:00.090395+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:01.090571+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:02.090710+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:03.090866+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:04.091012+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:05.091216+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:06.091350+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:07.091566+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:08.091769+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:09.091990+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:10.092186+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:11.092377+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:12.092593+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:13.092782+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:14.092970+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:15.093256+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:16.093538+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:17.093783+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:18.093941+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:19.094075+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:20.094273+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:21.094481+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:22.094728+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:23.094988+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:24.095227+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:25.095565+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:26.095765+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:27.096006+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:28.096203+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:29.096423+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:30.097624+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 12779520 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:31.100636+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:32.100855+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:33.101136+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:34.103687+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:35.105034+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:36.105503+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:37.105724+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:38.106813+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:39.107732+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:40.108499+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:41.108762+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:42.108927+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:43.109167+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:44.109385+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:45.109630+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:46.109801+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:47.109989+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:48.110327+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:49.110592+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:50.110803+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:51.110992+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:52.111126+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:53.111318+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:54.111536+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:55.111816+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:56.111978+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:57.112150+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:58.112317+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:38:59.112629+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:00.112852+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:01.113271+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:02.113455+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:03.113602+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 12771328 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:04.113767+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68042752 unmapped: 12648448 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: do_command 'config diff' '{prefix=config diff}'
Dec 01 09:48:10 compute-0 ceph-osd[88047]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec 01 09:48:10 compute-0 ceph-osd[88047]: do_command 'config show' '{prefix=config show}'
Dec 01 09:48:10 compute-0 ceph-osd[88047]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec 01 09:48:10 compute-0 ceph-osd[88047]: do_command 'counter dump' '{prefix=counter dump}'
Dec 01 09:48:10 compute-0 ceph-osd[88047]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec 01 09:48:10 compute-0 ceph-osd[88047]: do_command 'counter schema' '{prefix=counter schema}'
Dec 01 09:48:10 compute-0 ceph-osd[88047]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:05.114064+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 12165120 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:06.118605+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 12165120 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: do_command 'log dump' '{prefix=log dump}'
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:07.118727+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: do_command 'log dump' '{prefix=log dump}' result is 0 bytes
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 12410880 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: do_command 'perf dump' '{prefix=perf dump}'
Dec 01 09:48:10 compute-0 ceph-osd[88047]: do_command 'perf dump' '{prefix=perf dump}' result is 0 bytes
Dec 01 09:48:10 compute-0 ceph-osd[88047]: do_command 'perf histogram dump' '{prefix=perf histogram dump}'
Dec 01 09:48:10 compute-0 ceph-osd[88047]: do_command 'perf histogram dump' '{prefix=perf histogram dump}' result is 0 bytes
Dec 01 09:48:10 compute-0 ceph-osd[88047]: do_command 'perf schema' '{prefix=perf schema}'
Dec 01 09:48:10 compute-0 ceph-osd[88047]: do_command 'perf schema' '{prefix=perf schema}' result is 0 bytes
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:08.118890+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:09.119035+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:10.119188+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:11.119320+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:12.119467+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:13.119618+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:14.119792+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:15.119986+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:16.120130+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:17.120281+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:18.120443+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:19.120578+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:20.120717+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:21.120855+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:22.120996+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:23.121131+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:24.121271+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:25.121516+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:26.121658+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:27.121946+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:28.122075+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:29.122226+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:30.122367+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:31.122511+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:32.122701+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:33.122879+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:34.123072+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:35.123324+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:36.123514+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:37.123664+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:38.123847+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:39.124022+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:40.124160+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:41.124330+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:42.124484+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:43.124668+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:44.124880+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:45.125072+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:46.125240+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:47.125422+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:48.125550+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:49.125672+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:50.125809+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:51.125966+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:52.126125+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:53.126395+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:54.126702+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:55.126919+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:56.127170+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:57.127375+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:58.127540+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:39:59.127714+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:00.127853+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:01.128024+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:02.128157+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:03.128360+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:04.128491+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:05.128743+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:06.128999+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:07.129139+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:08.129272+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:09.129460+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:10.129632+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:11.129866+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:12.130125+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:13.130443+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:14.130644+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:15.130938+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:16.131189+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:17.131453+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:18.131836+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:19.132232+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:20.132440+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:21.132620+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:22.132797+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:23.133070+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:24.133421+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:25.133752+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:26.133987+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:27.134235+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:28.134447+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:29.134652+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:30.134834+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:31.134982+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:32.135130+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:33.135267+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:34.135456+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:35.135670+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:36.135824+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:37.135956+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:38.136136+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:39.136355+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:40.136520+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:41.136704+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:42.136971+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:43.137180+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:44.137367+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:45.137619+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:46.137816+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:47.137978+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:48.138174+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:49.138374+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:50.138510+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:51.138668+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:52.138804+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:53.138986+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:54.139137+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:55.139359+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:56.139557+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:57.139745+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:58.139886+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:40:59.140047+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:00.140215+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:01.140392+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:02.140568+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:03.140765+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:04.140922+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:05.141145+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:06.141354+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:07.141513+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:08.141739+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:09.141899+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:10.142043+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:11.142201+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:12.142355+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:13.142502+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:14.142666+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:15.142888+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:16.143078+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:17.143242+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:18.143354+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:19.143481+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:20.143725+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:21.143890+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:22.144050+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:23.144228+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:24.144390+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:25.144640+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:26.144819+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:27.145035+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:28.145176+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:29.145376+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:30.145562+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:31.145736+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:32.145924+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:33.146131+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:34.146407+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:35.146715+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:36.146893+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:37.147153+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:38.147362+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:39.147541+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:40.147739+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:41.147872+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:42.148036+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:43.148238+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:44.148391+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:45.148571+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:46.148719+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:47.148881+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:48.149027+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:49.149202+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:50.149365+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:51.149545+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:52.149713+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:53.149942+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:54.150111+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:55.150270+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:56.150666+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:57.150932+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:58.151189+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:41:59.151369+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:00.151530+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:01.151708+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:02.151855+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:03.151981+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:04.152137+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:05.152714+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:06.152912+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:07.153133+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:08.153283+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:09.153471+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:10.153602+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:11.153757+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:12.153944+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:13.154131+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:14.154354+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:15.154544+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:16.154711+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:17.154855+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:18.155075+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:19.155328+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:20.155524+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:21.155747+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:22.155952+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:23.156122+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:24.156252+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:25.156474+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:26.156592+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:27.156715+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:28.156847+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:29.157005+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:30.157169+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:31.157342+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:32.157543+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:33.157697+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:34.157955+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:35.158207+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:36.158417+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:37.158632+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:38.158848+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:39.159068+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:40.159197+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:41.159365+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:42.159586+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:43.159766+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:44.159935+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:45.160160+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:46.160349+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:47.160487+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:48.160634+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:49.160845+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:50.161049+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:51.161209+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:52.161365+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:53.161626+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:54.161829+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:55.162070+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:56.162267+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:57.162504+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:58.162675+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:42:59.162864+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:00.163087+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:01.163345+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:02.163504+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:03.163678+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:04.163920+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:05.164245+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:06.164428+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:07.164662+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:08.164821+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:09.165007+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:10.165346+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:11.165543+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:12.165727+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:13.165858+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:14.166008+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:15.166201+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:16.166678+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:17.166860+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:18.167106+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:19.167346+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:20.167557+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:21.167751+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:22.167914+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:23.168082+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:24.168344+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:25.168566+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:26.168703+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:27.168879+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:28.169048+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:29.169240+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:30.169494+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:31.169632+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:32.169788+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:33.170000+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:34.170283+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:35.170595+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:36.170824+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:37.171131+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:38.171372+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:39.171553+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:40.171762+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.1 total, 600.0 interval
                                           Cumulative writes: 5306 writes, 22K keys, 5306 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 5306 writes, 841 syncs, 6.31 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1098 writes, 3074 keys, 1098 commit groups, 1.0 writes per commit group, ingest: 1.84 MB, 0.00 MB/s
                                           Interval WAL: 1098 writes, 472 syncs, 2.33 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:41.171954+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:42.172189+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 11943936 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:43.172345+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:44.172505+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:45.172771+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:46.172902+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:47.173115+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:48.173282+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:49.173498+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:50.173698+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:51.173930+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:52.174059+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:53.174276+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:54.174484+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:55.174662+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:56.174824+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:57.175056+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:58.175257+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 ms_handle_reset con 0x55c73900b400 session 0x55c7389ade00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c737c7bc00
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:43:59.175468+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:00.175624+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:01.175780+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:02.175937+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:03.176198+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:04.176384+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:05.176644+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:06.176820+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:07.176975+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:08.177137+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:09.177418+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:10.177665+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:11.177898+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:12.178043+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:13.178234+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:14.178401+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:15.178853+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:16.179013+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:17.179200+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:18.179349+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:19.179504+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:20.179691+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:21.179998+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:22.180197+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:23.180398+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:24.180600+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:25.180781+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:26.180954+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:27.181108+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:28.181277+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:29.181514+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:30.181691+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:31.181854+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:32.182034+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:33.182165+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:34.182320+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:35.182520+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:36.182673+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:37.182820+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:38.182983+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:39.183121+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:40.183251+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:41.183422+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:42.183565+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:43.183708+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:44.183850+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:45.184063+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:46.184228+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:47.188369+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:48.188527+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:49.188683+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:50.188831+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:51.189118+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:52.189326+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:53.189520+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:54.189729+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:55.189974+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:56.190138+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:57.190344+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:58.190767+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:44:59.190998+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:00.191163+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:01.191374+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 ms_handle_reset con 0x55c73900ac00 session 0x55c738a9b2c0
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: handle_auth_request added challenge on 0x55c739000400
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:02.191653+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:03.191893+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:04.192170+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:05.192536+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:06.192760+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:07.193005+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:08.193366+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:09.193552+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:10.193703+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:11.193939+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:12.194206+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:13.194437+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:14.194639+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:15.194905+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:16.195132+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:17.195390+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:18.195563+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:19.195815+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:20.195957+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:21.196131+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:22.196313+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:23.196510+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:24.196740+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:25.197077+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:26.197236+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:27.197477+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:28.197708+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:29.197967+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:30.198181+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:31.198394+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:32.198559+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:33.198884+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:34.199065+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68755456 unmapped: 11935744 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:35.199247+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:36.199452+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:37.199647+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:38.199837+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:39.200062+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:40.200259+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:41.200501+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:42.200697+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:43.207098+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:44.207314+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:45.207565+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:46.207724+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:47.207932+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:48.208122+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:49.208352+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:50.208515+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:51.208659+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:52.208817+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:53.209014+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:54.209174+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:55.209404+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:56.209568+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:57.209689+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:58.209822+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:45:59.210054+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:00.210511+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:01.210813+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:02.210957+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:03.211355+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:04.211661+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:05.211868+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:06.212156+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:07.212569+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:08.212834+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:09.213011+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:10.213222+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:11.213503+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:12.213718+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:13.213991+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:14.214197+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:15.214500+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:16.214672+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:17.214818+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:18.214974+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:19.215245+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:20.215403+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:21.215682+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:22.215834+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:23.216039+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:24.216197+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:25.216484+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:26.216633+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:27.216788+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:28.217257+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:29.217476+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:30.217606+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:31.217836+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:32.218043+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:33.218362+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:34.218511+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:35.218724+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:36.218991+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:37.219191+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:38.219433+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:39.219656+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:40.219904+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:41.220058+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:42.220281+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:43.220579+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:44.220762+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:45.221006+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:46.221217+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:47.221431+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:48.221608+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:49.221831+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:50.221981+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:51.222228+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:52.222431+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:53.222616+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:54.222858+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:55.223143+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:56.223316+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:57.223490+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:58.223727+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:46:59.224078+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:00.224217+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:01.224365+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:02.224555+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:03.224722+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:04.224899+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:05.225132+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:06.225361+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:07.225762+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:08.226411+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:09.226942+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:10.227271+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:11.228136+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:12.228636+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:13.229092+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:14.229393+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:15.229642+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:16.230108+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:17.230562+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:18.230789+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:19.231012+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:20.231240+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:21.231366+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:22.231545+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:23.231667+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:24.231811+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:25.231977+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:26.232119+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:27.232314+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:28.232444+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:29.232605+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:30.232808+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:31.233069+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:32.233457+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:33.233629+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:34.233806+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:35.233989+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:36.234126+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fcb24000/0x0/0x4ffc00000, data 0x685ec/0xf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 11927552 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:37.234334+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: do_command 'config diff' '{prefix=config diff}'
Dec 01 09:48:10 compute-0 ceph-osd[88047]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec 01 09:48:10 compute-0 ceph-osd[88047]: do_command 'config show' '{prefix=config show}'
Dec 01 09:48:10 compute-0 ceph-osd[88047]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec 01 09:48:10 compute-0 ceph-osd[88047]: do_command 'counter dump' '{prefix=counter dump}'
Dec 01 09:48:10 compute-0 ceph-osd[88047]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec 01 09:48:10 compute-0 ceph-osd[88047]: do_command 'counter schema' '{prefix=counter schema}'
Dec 01 09:48:10 compute-0 ceph-osd[88047]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68886528 unmapped: 11804672 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:38.234567+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68902912 unmapped: 11788288 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 09:48:10 compute-0 ceph-osd[88047]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 09:48:10 compute-0 ceph-osd[88047]: bluestore.MempoolThread(0x55c736491b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 585257 data_alloc: 218103808 data_used: 266240
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: tick
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_tickets
Dec 01 09:48:10 compute-0 ceph-osd[88047]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:47:39.234697+0000)
Dec 01 09:48:10 compute-0 ceph-osd[88047]: prioritycache tune_memory target: 4294967296 mapped: 68837376 unmapped: 11853824 heap: 80691200 old mem: 2845415832 new mem: 2845415832
Dec 01 09:48:10 compute-0 ceph-osd[88047]: do_command 'log dump' '{prefix=log dump}'
Dec 01 09:48:10 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0) v1
Dec 01 09:48:10 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3480816558' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Dec 01 09:48:10 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:48:10 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #54. Immutable memtables: 0.
Dec 01 09:48:10 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:48:10.760699) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 09:48:10 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 54
Dec 01 09:48:10 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582490760948, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 1230, "num_deletes": 507, "total_data_size": 869562, "memory_usage": 892264, "flush_reason": "Manual Compaction"}
Dec 01 09:48:10 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #55: started
Dec 01 09:48:10 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582490768931, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 55, "file_size": 846467, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 22611, "largest_seqno": 23840, "table_properties": {"data_size": 840924, "index_size": 2363, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2117, "raw_key_size": 16044, "raw_average_key_size": 19, "raw_value_size": 827451, "raw_average_value_size": 999, "num_data_blocks": 105, "num_entries": 828, "num_filter_entries": 828, "num_deletions": 507, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582417, "oldest_key_time": 1764582417, "file_creation_time": 1764582490, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 55, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:48:10 compute-0 ceph-mon[75031]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 8072 microseconds, and 3292 cpu microseconds.
Dec 01 09:48:10 compute-0 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 09:48:10 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:48:10.768968) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #55: 846467 bytes OK
Dec 01 09:48:10 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:48:10.768989) [db/memtable_list.cc:519] [default] Level-0 commit table #55 started
Dec 01 09:48:10 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:48:10.770458) [db/memtable_list.cc:722] [default] Level-0 commit table #55: memtable #1 done
Dec 01 09:48:10 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:48:10.770473) EVENT_LOG_v1 {"time_micros": 1764582490770468, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 01 09:48:10 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:48:10.770490) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 01 09:48:10 compute-0 ceph-mon[75031]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 862677, prev total WAL file size 862677, number of live WAL files 2.
Dec 01 09:48:10 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000051.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:48:10 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:48:10.771282) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00353030' seq:72057594037927935, type:22 .. '6C6F676D00373533' seq:0, type:0; will stop at (end)
Dec 01 09:48:10 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 01 09:48:10 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [55(826KB)], [53(6606KB)]
Dec 01 09:48:10 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582490771339, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [55], "files_L6": [53], "score": -1, "input_data_size": 7611111, "oldest_snapshot_seqno": -1}
Dec 01 09:48:10 compute-0 ceph-mon[75031]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #56: 4224 keys, 5447538 bytes, temperature: kUnknown
Dec 01 09:48:10 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582490801214, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 56, "file_size": 5447538, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 5419188, "index_size": 16720, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10565, "raw_key_size": 103906, "raw_average_key_size": 24, "raw_value_size": 5343064, "raw_average_value_size": 1264, "num_data_blocks": 705, "num_entries": 4224, "num_filter_entries": 4224, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764580340, "oldest_key_time": 0, "file_creation_time": 1764582490, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45d3ecca-3e60-40df-8d21-b0b3630e7b99", "db_session_id": "2DUIFG3VBWNEITLEK8RC", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:48:10 compute-0 ceph-mon[75031]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 09:48:10 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:48:10.801657) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 5447538 bytes
Dec 01 09:48:10 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:48:10.803000) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 252.4 rd, 180.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 6.5 +0.0 blob) out(5.2 +0.0 blob), read-write-amplify(15.4) write-amplify(6.4) OK, records in: 5251, records dropped: 1027 output_compression: NoCompression
Dec 01 09:48:10 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:48:10.803027) EVENT_LOG_v1 {"time_micros": 1764582490803016, "job": 28, "event": "compaction_finished", "compaction_time_micros": 30160, "compaction_time_cpu_micros": 14347, "output_level": 6, "num_output_files": 1, "total_output_size": 5447538, "num_input_records": 5251, "num_output_records": 4224, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 09:48:10 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000055.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:48:10 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582490803502, "job": 28, "event": "table_file_deletion", "file_number": 55}
Dec 01 09:48:10 compute-0 ceph-mon[75031]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:48:10 compute-0 ceph-mon[75031]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582490804941, "job": 28, "event": "table_file_deletion", "file_number": 53}
Dec 01 09:48:10 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:48:10.771186) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:48:10 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:48:10.805065) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:48:10 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:48:10.805073) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:48:10 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:48:10.805078) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:48:10 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:48:10.805082) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:48:10 compute-0 ceph-mon[75031]: rocksdb: (Original Log Time 2025/12/01-09:48:10.805086) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:48:11 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df"} v 0) v1
Dec 01 09:48:11 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1839763276' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Dec 01 09:48:11 compute-0 ceph-mon[75031]: from='client.15168 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:48:11 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3480816558' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Dec 01 09:48:11 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1839763276' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Dec 01 09:48:11 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "fs dump"} v 0) v1
Dec 01 09:48:11 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3958724300' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Dec 01 09:48:11 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "fs ls"} v 0) v1
Dec 01 09:48:11 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1926194822' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Dec 01 09:48:11 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1130: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:48:12 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15178 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:48:12 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/3958724300' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Dec 01 09:48:12 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1926194822' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Dec 01 09:48:12 compute-0 ceph-mon[75031]: pgmap v1130: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:48:12 compute-0 systemd[1]: Starting Hostname Service...
Dec 01 09:48:12 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mds stat"} v 0) v1
Dec 01 09:48:12 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1389647452' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Dec 01 09:48:12 compute-0 systemd[1]: Started Hostname Service.
Dec 01 09:48:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Optimize plan auto_2025-12-01_09:48:13
Dec 01 09:48:13 compute-0 ceph-mgr[75324]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 01 09:48:13 compute-0 ceph-mgr[75324]: [balancer INFO root] do_upmap
Dec 01 09:48:13 compute-0 ceph-mgr[75324]: [balancer INFO root] pools ['cephfs.cephfs.meta', 'cephfs.cephfs.data', 'volumes', '.mgr', 'vms', 'images', 'backups']
Dec 01 09:48:13 compute-0 ceph-mgr[75324]: [balancer INFO root] prepared 0/10 changes
Dec 01 09:48:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:48:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:48:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:48:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:48:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] scanning for idle connections..
Dec 01 09:48:13 compute-0 ceph-mgr[75324]: [volumes INFO mgr_util] cleaning up connections: []
Dec 01 09:48:13 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon dump"} v 0) v1
Dec 01 09:48:13 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4114210167' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Dec 01 09:48:13 compute-0 ceph-mon[75031]: from='client.15178 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:48:13 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1389647452' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Dec 01 09:48:13 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/4114210167' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Dec 01 09:48:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 01 09:48:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:48:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 01 09:48:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 01 09:48:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:48:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 01 09:48:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:48:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 01 09:48:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:48:13 compute-0 ceph-mgr[75324]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 01 09:48:13 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15184 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:48:13 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1131: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:48:14 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd blocklist ls"} v 0) v1
Dec 01 09:48:14 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1507525120' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Dec 01 09:48:14 compute-0 ceph-mon[75031]: from='client.15184 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:48:14 compute-0 ceph-mon[75031]: pgmap v1131: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:48:14 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1507525120' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Dec 01 09:48:14 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15188 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:48:14 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15190 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:48:15 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd dump"} v 0) v1
Dec 01 09:48:15 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4246584534' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Dec 01 09:48:15 compute-0 ceph-mon[75031]: from='client.15188 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:48:15 compute-0 ceph-mon[75031]: from='client.15190 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:48:15 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/4246584534' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Dec 01 09:48:15 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd numa-status"} v 0) v1
Dec 01 09:48:15 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1803597125' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Dec 01 09:48:15 compute-0 ceph-mon[75031]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:48:15 compute-0 ceph-mgr[75324]: log_channel(cluster) log [DBG] : pgmap v1132: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:48:16 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15196 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:48:16 compute-0 ceph-mon[75031]: from='client.? 192.168.122.100:0/1803597125' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Dec 01 09:48:16 compute-0 ceph-mon[75031]: pgmap v1132: 193 pgs: 193 active+clean; 41 MiB data, 188 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:48:16 compute-0 ceph-mon[75031]: from='client.15196 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:48:16 compute-0 ceph-mgr[75324]: log_channel(audit) log [DBG] : from='client.15198 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:48:16 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:48:16 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 01 09:48:16 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:48:16 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 6.359070782053786e-08 of space, bias 1.0, pg target 1.907721234616136e-05 quantized to 32 (current 32)
Dec 01 09:48:16 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:48:16 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 6.359070782053786e-08 of space, bias 1.0, pg target 1.907721234616136e-05 quantized to 32 (current 32)
Dec 01 09:48:16 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:48:16 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:48:16 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:48:16 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0006661126644201341 of space, bias 1.0, pg target 0.19983379932604023 quantized to 32 (current 32)
Dec 01 09:48:16 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:48:16 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 5.087256625643029e-07 of space, bias 4.0, pg target 0.0006104707950771635 quantized to 16 (current 32)
Dec 01 09:48:16 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 01 09:48:16 compute-0 ceph-mgr[75324]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 01 09:48:16 compute-0 ceph-mon[75031]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail"} v 0) v1
Dec 01 09:48:16 compute-0 ceph-mon[75031]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1944196512' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
